NTokenizers 3.0.0
See the version list below for details.
.NET CLI:
dotnet add package NTokenizers --version 3.0.0

Package Manager:
NuGet\Install-Package NTokenizers -Version 3.0.0

PackageReference:
<PackageReference Include="NTokenizers" Version="3.0.0" />

Central Package Management:
<PackageVersion Include="NTokenizers" Version="3.0.0" />
<PackageReference Include="NTokenizers" />

Paket CLI:
paket add NTokenizers --version 3.0.0

Script & Interactive:
#r "nuget: NTokenizers, 3.0.0"

File-based apps:
#:package NTokenizers@3.0.0

Cake:
#addin nuget:?package=NTokenizers&version=3.0.0
#tool nuget:?package=NTokenizers&version=3.0.0
NTokenizers
Collection of stream-capable tokenizers for processing Markdown, JSON, XML, YAML, SQL, CSS, TypeScript, and C#.
Kickoff token processing
// kickoff markdown tokenizer
await MarkdownTokenizer.Create().ParseAsync(stream, onToken: async token => { /* handle markdown-tokens here */ });
// kickoff csharp tokenizer
await CSharpTokenizer.Create().ParseAsync(stream, onToken: token => { /* handle csharp-tokens here */ });
// kickoff json tokenizer
await JsonTokenizer.Create().ParseAsync(stream, onToken: token => { /* handle json-tokens here */ });
// kickoff sql tokenizer
await SqlTokenizer.Create().ParseAsync(stream, onToken: token => { /* handle sql-tokens here */ });
// kickoff typescript tokenizer
await TypescriptTokenizer.Create().ParseAsync(stream, onToken: token => { /* handle typescript-tokens here */ });
// kickoff css tokenizer
await CssTokenizer.Create().ParseAsync(stream, onToken: token => { /* handle css-tokens here */ });
// kickoff xml tokenizer
await XmlTokenizer.Create().ParseAsync(stream, onToken: token => { /* handle xml-tokens here */ });
// kickoff yaml tokenizer
await YamlTokenizer.Create().ParseAsync(stream, onToken: token => { /* handle yaml-tokens here */ });
Overview
NTokenizers is a .NET library written in C# that provides tokenizers for processing structured text formats such as Markdown, JSON, XML, YAML, SQL, TypeScript, CSS, and C#. The ParseAsync method is the core entry point: it breaks structured text down into meaningful components (tokens) for processing. Its key feature is stream-processing capability: it handles data as it arrives in real time, making it ideal for processing large files or streaming data without loading everything into memory at once.
These tokenizers are not validation-based and are primarily intended for prettifying, formatting, or visualizing structured text. They do not perform strict validation of the input format, so they may produce unexpected results when processing malformed or invalid XML, JSON, or HTML. Use them with caution when dealing with untrusted or poorly formatted input.
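As a minimal sketch of the streaming model, a tokenizer can consume an in-memory stream and surface tokens as they are produced. The `Create()`/`ParseAsync()` calls follow the signatures shown in this README; the token's `TokenType` and `Value` members are taken from the full example below.

```csharp
using System.Text;
using NTokenizers.Json;

// Sketch: feed a small JSON document through the JSON tokenizer and
// print each token as it arrives from the stream.
var json = """{ "name": "Laura Smith", "active": true }""";
using var stream = new MemoryStream(Encoding.UTF8.GetBytes(json));

await JsonTokenizer.Create().ParseAsync(stream, onToken: token =>
{
    // Each token is surfaced as soon as the tokenizer can emit it,
    // without waiting for the rest of the input.
    Console.WriteLine($"{token.TokenType}: {token.Value}");
});
```

The same pattern works with any stream source, such as the anonymous pipe used in the full example below.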
MarkupTokenizer was renamed to MarkdownTokenizer in v2.
Used by
- NTokenizers.Extensions.Spectre.Console: Spectre.Console rendering extensions for NTokenizers, providing style-rich console syntax highlighting.
Architecture
Most tokenizers, such as the JSON or XML tokenizer, can be used individually, depending on the specific format you want to parse.
The MarkdownTokenizer however is a special case. Instead of working on a single format, it acts as a composite tokenizer, using the other tokenizers as subtokenizers. When parsing a stream, MarkdownTokenizer delegates portions of the input to the appropriate subtokenizer, allowing it to handle multiple formats seamlessly in one pass.
The same principle applies to inline tokenizers such as Heading, Blockquote, ListItem, and others. Unlike the standalone tokenizers, however, they cannot be used individually, and they produce the same token types as the MarkdownTokenizer.
Diagram
┌─────────┐
│ stream │
└─────────┘
│ ParseAsync()
▼
┌─────────────────────┐
│ MarkdownTokenizer │ ───────────► fire markdown tokens
└─────────────────────┘
│
▼ ┌─────────┐
├──────►│ json │ ───► fire json tokens
│ └─────────┘
│
│ ┌─────────┐
├──────►│ Heading │ ───► fire markdown tokens
│ └─────────┘
│
│ ┌─────────┐
└──────►│ etc.. │ ───► etc
└─────────┘
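The delegation shown in the diagram can be sketched in a few lines: a Markdown stream containing a fenced JSON block causes the MarkdownTokenizer to hand that block to the JSON subtokenizer, whose tokens are received through the block's metadata (the `JsonCodeBlockMetadata`/`RegisterInlineTokenHandler` names are taken from the full example below).

```csharp
using System.Text;
using NTokenizers.Json;
using NTokenizers.Markdown;
using NTokenizers.Markdown.Metadata;

// Build a Markdown snippet with an embedded ```json code block.
var fence = new string('`', 3);
var markdown = $"# Config\n{fence}json\n{{ \"active\": true }}\n{fence}\n";
using var stream = new MemoryStream(Encoding.UTF8.GetBytes(markdown));

await MarkdownTokenizer.Create().ParseAsync(stream, onToken: async token =>
{
    if (token.Metadata is JsonCodeBlockMetadata jsonMetadata)
    {
        // These tokens come from the delegated JSON subtokenizer.
        await jsonMetadata.RegisterInlineTokenHandler(inlineToken =>
            Console.Write(inlineToken.Value));
    }
    else
    {
        // Ordinary markdown tokens (heading, text, ...).
        Console.Write(token.Value);
    }
});
```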
Example
Here's a simple example showing how to use the MarkdownTokenizer:
using NTokenizers.Json;
using NTokenizers.Markdown;
using NTokenizers.Markdown.Metadata;
using NTokenizers.Typescript;
using NTokenizers.Xml;
using Spectre.Console;
using System.IO.Pipes;
using System.Text;
class Program
{
    static async Task Main()
    {
        string markdown = """
            Here is some **bold** text and some *italic* text.
            # NTokenizers Showcase
            ## XML example
            ```xml
            <user id="4821" active="true">
              <name>Laura Smith</name>
            </user>
            ```
            ## JSON example
            ```json
            {
              "name": "Laura Smith",
              "active": true
            }
            ```
            ## TypeScript example
            ```typescript
            const user = {
              name: "Laura Smith",
              active: true
            };
            ```
            """;

        // Create connected streams
        using var pipe = new AnonymousPipeServerStream(PipeDirection.Out);
        using var stream = new AnonymousPipeClientStream(PipeDirection.In, pipe.ClientSafePipeHandle);

        // Start slow writer
        var writerTask = EmitSlowlyAsync(markdown, pipe);

        // Parse markdown
        await MarkdownTokenizer.Create().ParseAsync(stream, onToken: async token =>
        {
            if (token.Metadata is HeadingMetadata headingMetadata)
            {
                await headingMetadata.RegisterInlineTokenHandler(inlineToken =>
                {
                    var value = Markup.Escape(inlineToken.Value);
                    var colored = headingMetadata.Level != 1
                        ? new Markup($"[bold GreenYellow]{value}[/]")
                        : new Markup($"[bold yellow]** {value} **[/]");
                    AnsiConsole.Write(colored);
                });
            }
            else if (token.Metadata is XmlCodeBlockMetadata xmlMetadata)
            {
                await xmlMetadata.RegisterInlineTokenHandler(inlineToken =>
                {
                    var value = Markup.Escape(inlineToken.Value);
                    var colored = inlineToken.TokenType switch
                    {
                        XmlTokenType.ElementName => new Markup($"[blue]{value}[/]"),
                        XmlTokenType.EndElement => new Markup($"[blue]{value}[/]"),
                        XmlTokenType.OpeningAngleBracket => new Markup($"[yellow]{value}[/]"),
                        XmlTokenType.ClosingAngleBracket => new Markup($"[yellow]{value}[/]"),
                        XmlTokenType.SelfClosingSlash => new Markup($"[yellow]{value}[/]"),
                        XmlTokenType.AttributeName => new Markup($"[cyan]{value}[/]"),
                        XmlTokenType.AttributeEquals => new Markup($"[yellow]{value}[/]"),
                        XmlTokenType.AttributeQuote => new Markup($"[grey]{value}[/]"),
                        XmlTokenType.AttributeValue => new Markup($"[green]{value}[/]"),
                        XmlTokenType.Text => new Markup($"[white]{value}[/]"),
                        XmlTokenType.Whitespace => new Markup($"[grey]{value}[/]"),
                        _ => new Markup(value)
                    };
                    AnsiConsole.Write(colored);
                });
            }
            else if (token.Metadata is JsonCodeBlockMetadata jsonMetadata)
            {
                await jsonMetadata.RegisterInlineTokenHandler(inlineToken =>
                {
                    var value = Markup.Escape(inlineToken.Value);
                    var colored = inlineToken.TokenType switch
                    {
                        JsonTokenType.StartObject => new Markup($"[yellow]{value}[/]"),
                        JsonTokenType.EndObject => new Markup($"[yellow]{value}[/]"),
                        JsonTokenType.StartArray => new Markup($"[yellow]{value}[/]"),
                        JsonTokenType.EndArray => new Markup($"[yellow]{value}[/]"),
                        JsonTokenType.PropertyName => new Markup($"[cyan]{value}[/]"),
                        JsonTokenType.StringValue => new Markup($"[green]{value}[/]"),
                        JsonTokenType.Number => new Markup($"[magenta]{value}[/]"),
                        JsonTokenType.True => new Markup($"[orange1]{value}[/]"),
                        JsonTokenType.False => new Markup($"[orange1]{value}[/]"),
                        JsonTokenType.Null => new Markup($"[grey]{value}[/]"),
                        JsonTokenType.Colon => new Markup($"[yellow]{value}[/]"),
                        JsonTokenType.Comma => new Markup($"[yellow]{value}[/]"),
                        JsonTokenType.Whitespace => new Markup($"[grey]{value}[/]"),
                        _ => new Markup(value)
                    };
                    AnsiConsole.Write(colored);
                });
            }
            else if (token.Metadata is TypeScriptCodeBlockMetadata tsMetadata)
            {
                await tsMetadata.RegisterInlineTokenHandler(inlineToken =>
                {
                    var value = Markup.Escape(inlineToken.Value);
                    var colored = inlineToken.TokenType switch
                    {
                        TypescriptTokenType.Identifier => new Markup($"[cyan]{value}[/]"),
                        TypescriptTokenType.Keyword => new Markup($"[blue]{value}[/]"),
                        TypescriptTokenType.StringValue => new Markup($"[green]{value}[/]"),
                        TypescriptTokenType.Number => new Markup($"[magenta]{value}[/]"),
                        TypescriptTokenType.Operator => new Markup($"[yellow]{value}[/]"),
                        TypescriptTokenType.Comment => new Markup($"[grey]{value}[/]"),
                        TypescriptTokenType.Whitespace => new Markup($"[grey]{value}[/]"),
                        _ => new Markup(value)
                    };
                    AnsiConsole.Write(colored);
                });
            }
            else
            {
                // Handle regular markdown tokens
                var value = Markup.Escape(token.Value);
                var colored = token.TokenType switch
                {
                    MarkdownTokenType.Text => new Markup($"{value}"),
                    MarkdownTokenType.Bold => new Markup($"[bold]{value}[/]"),
                    MarkdownTokenType.Italic => new Markup($"[italic]{value}[/]"),
                    _ => new Markup(value)
                };
                AnsiConsole.Write(colored);
            }

            if (token.Metadata is InlineMarkdownMetadata)
            {
                AnsiConsole.WriteLine();
            }
        });

        await writerTask;

        Console.WriteLine();
        Console.WriteLine("Done.");
    }

    static async Task EmitSlowlyAsync(string markdown, Stream output)
    {
        var rng = new Random();
        byte[] bytes = Encoding.UTF8.GetBytes(markdown);
        foreach (var b in bytes)
        {
            await output.WriteAsync(new[] { b }.AsMemory(0, 1));
            await output.FlushAsync();
            await Task.Delay(rng.Next(2, 8));
        }
        output.Close(); // EOF
    }
}
For more information, check out the documentation.
| Product | Compatible and additional computed target framework versions |
|---|---|
| .NET | net5.0 was computed. net5.0-windows was computed. net6.0 was computed. net6.0-android was computed. net6.0-ios was computed. net6.0-maccatalyst was computed. net6.0-macos was computed. net6.0-tvos was computed. net6.0-windows was computed. net7.0 was computed. net7.0-android was computed. net7.0-ios was computed. net7.0-maccatalyst was computed. net7.0-macos was computed. net7.0-tvos was computed. net7.0-windows was computed. net8.0 was computed. net8.0-android was computed. net8.0-browser was computed. net8.0-ios was computed. net8.0-maccatalyst was computed. net8.0-macos was computed. net8.0-tvos was computed. net8.0-windows was computed. net9.0 was computed. net9.0-android was computed. net9.0-browser was computed. net9.0-ios was computed. net9.0-maccatalyst was computed. net9.0-macos was computed. net9.0-tvos was computed. net9.0-windows was computed. net10.0 was computed. net10.0-android was computed. net10.0-browser was computed. net10.0-ios was computed. net10.0-maccatalyst was computed. net10.0-macos was computed. net10.0-tvos was computed. net10.0-windows was computed. |
| .NET Core | netcoreapp2.0 was computed. netcoreapp2.1 was computed. netcoreapp2.2 was computed. netcoreapp3.0 was computed. netcoreapp3.1 was computed. |
| .NET Standard | netstandard2.0 is compatible. netstandard2.1 was computed. |
| .NET Framework | net461 was computed. net462 was computed. net463 was computed. net47 was computed. net471 was computed. net472 was computed. net48 was computed. net481 was computed. |
| MonoAndroid | monoandroid was computed. |
| MonoMac | monomac was computed. |
| MonoTouch | monotouch was computed. |
| Tizen | tizen40 was computed. tizen60 was computed. |
| Xamarin.iOS | xamarinios was computed. |
| Xamarin.Mac | xamarinmac was computed. |
| Xamarin.TVOS | xamarintvos was computed. |
| Xamarin.WatchOS | xamarinwatchos was computed. |
Dependencies
.NETStandard 2.0
- No dependencies.
NuGet packages (1)
Showing the top 1 NuGet packages that depend on NTokenizers:
| Package | Downloads |
|---|---|
| NTokenizers.Extensions.Spectre.Console: Stream-capable Spectre.Console rendering extensions for NTokenizers (XML, JSON, Markdown, TypeScript, CSS, C# and SQL), style-rich console syntax highlighting. | |
GitHub repositories
This package is not used by any popular GitHub repositories.
| Version | Downloads | Last Updated |
|---|---|---|
| 4.0.0 | 27 | 1/14/2026 |
| 3.0.0 | 134 | 1/4/2026 |
| 2.2.0 | 85 | 1/3/2026 |
| 2.1.0 | 303 | 12/11/2025 |
| 2.0.0 | 417 | 12/11/2025 |
| 1.0.1 | 436 | 12/9/2025 |
| 1.0.0 | 388 | 12/8/2025 |
| 0.8.0-preview | 165 | 12/6/2025 |
| 0.7.0-preview | 122 | 12/6/2025 |
| 0.6.0-preview | 708 | 12/3/2025 |
| 0.5.0-preview | 665 | 12/3/2025 |
| 0.4.0-preview | 663 | 12/2/2025 |
| 0.3.0-preview | 698 | 12/2/2025 |
| 0.2.0-preview | 185 | 11/27/2025 |
| 0.1.0-preview | 394 | 11/19/2025 |
Release Notes
Added CSS tokenizer call in Markdown.