# CodecMapper 0.1.0

Install from NuGet with the .NET CLI:

```shell
dotnet add package CodecMapper --version 0.1.0
```

Or as a project reference:

```xml
<PackageReference Include="CodecMapper" Version="0.1.0" />
```
CodecMapper is a schema-first serialization library for F# focused on explicit wire contracts, symmetric encode/decode behavior, and portability to Native AOT and Fable-style targets.
It's for cases where serializer attributes and implicit conventions stop being helpful. You define one schema that mirrors the wire shape, then compile it into reusable codecs.
## Why the schema feels different
```fsharp
open CodecMapper
open CodecMapper.Schema

type Address = { Street: string; City: string }
let makeAddress street city = { Street = street; City = city }

type Person = { Id: int; Name: string; Home: Address }
let makePerson id name home = { Id = id; Name = name; Home = home }

let addressSchema =
    define<Address>
    |> construct makeAddress
    |> field "street" _.Street
    |> field "city" _.City
    |> build

let codec =
    define<Person>
    |> construct makePerson
    |> field "id" _.Id
    |> field "name" _.Name
    |> fieldWith "home" _.Home addressSchema
    |> Json.buildAndCompile

let person =
    {
        Id = 42
        Name = "Ada"
        Home = { Street = "Main"; City = "Adelaide" }
    }

let json = Json.serialize codec person
printfn "%s" json
// {"id":42,"name":"Ada","home":{"street":"Main","city":"Adelaide"}}

let decoded = Json.deserialize codec json
printfn "%A" decoded
// { Id = 42
//   Name = "Ada"
//   Home = { Street = "Main"
//            City = "Adelaide" } }
```
That schema reads almost like the data constructor:

- `Schema.define<Person>` says which value you are describing
- `Schema.construct makePerson` says how to rebuild it during decode
- each `Schema.field` names one wire field and points at the matching record field
- `Schema.fieldWith` says "this field has its own explicit child schema"
The result is not hidden serializer behavior. It is the contract itself, written in normal F#.
If the schema only exists inline at the end of the authoring pipeline, `Json.buildAndCompile`, `Xml.buildAndCompile`, `Yaml.buildAndCompile`, and `KeyValue.buildAndCompile` make that terminal step easier to scan. Keep `Json.compile personSchema` when the named schema is reused.
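A sketch of the two terminal styles side by side, reusing the combinators from the example above (the `Tag` type and the codec bindings are illustrative only):

```fsharp
open CodecMapper
open CodecMapper.Schema

type Tag = { Label: string }
let makeTag label = { Label = label }

// Inline authoring: the schema exists only to produce one codec,
// so the pipeline ends directly in Json.buildAndCompile.
let inlineCodec =
    define<Tag>
    |> construct makeTag
    |> field "label" _.Label
    |> Json.buildAndCompile

// Named authoring: build the schema once, then compile it per format.
let tagSchema =
    define<Tag>
    |> construct makeTag
    |> field "label" _.Label
    |> build

let jsonCodec = Json.compile tagSchema
let xmlCodec = Xml.compile tagSchema
```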
## Why use it
- The schema mirrors the data, so changes to the wire contract are visible in one place.
- Encode and decode come from the same definition, so drift is harder to introduce accidentally.
- `Json.compile` and `Xml.compile` reuse the same schema instead of making you maintain separate mappings.
- Domain refinement stays explicit through `Schema.map` and `Schema.tryMap` instead of being buried in serializer settings.
- Versioned message and config contracts stay deliberate because the wire shape is authored directly.
## Why not just use X?
CodecMapper is not trying to replace every serializer. It is for the cases where explicit contracts matter more than convention-driven convenience.
| Option | AOT | Fable | Style | Best fit |
|---|---|---|---|---|
| `System.Text.Json` | Good with source generation | No | CLR-shape and attributes | General-purpose .NET serialization |
| `Newtonsoft.Json` | Weaker | No | CLR-shape and attributes | Flexible JSON-heavy .NET apps |
| `Thoth.Json` | N/A | Strong | Explicit JSON codecs | F# apps that want JSON-only explicit codecs |
| Fleece / Chiron | Varies | Limited | F# JSON mapping | F#-first JSON mapping |
| DTOs + manual mapping | Strong | Strong | Explicit but duplicated | Strict transport/domain separation |
| JSON Schema-first | Varies | Varies | External schema owned | Integrating with schema-owned systems |
| CodecMapper | Strong | Strong | Authored schema contract | Explicit message/config contracts across JSON/XML |
Use System.Text.Json when convention-based object serialization is enough. Use CodecMapper when you want the contract itself to be visible, reviewable, reusable, and stable across model evolution.
## Where it fits well
CodecMapper is strongest when the wire contract matters and you want it to stay explicit.
- Message contracts: define the payload shape once and keep changes visible in the schema.
- Config contracts: treat configuration as a versioned boundary instead of incidental object serialization.
- Domain refinement: use `Schema.map` and `Schema.tryMap` when the runtime model should be stronger than the serialized shape.
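A small sketch of the `Schema.map` half, assuming `map` mirrors the `tryMap` usage shown later on this page but with a total function in each direction, and that a `float` primitive schema exists alongside the `int` one:

```fsharp
open CodecMapper.Schema

// A wrapper whose construction cannot fail, so Schema.map is enough
// and no Result-returning smart constructor is needed.
type Celsius = Celsius of float

module Celsius =
    let value (Celsius c) = c

// Hypothetical sketch: lift the primitive float schema into the wrapper.
let celsiusSchema =
    float
    |> map Celsius Celsius.value
```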
## How JSON Schema fits in
JSON Schema is a useful companion, but it is not the center of the library.
```
external schema docs
         ^
         |
  JsonSchema.generate
         |
running app <-> Schema<'T> <-> Json.compile / Xml.compile
         |
  JsonSchema.import
         |
         v
external schema-owned inputs
```
The authored Schema<'T> is the source of truth. You do not generate app code from JSON Schema in the normal path, and JSON Schema does not replace the schema DSL. CodecMapper sits in the middle: it drives the running codecs you use in the app, and it can also project outward to formal schema documents or receive external schema-owned contracts.
- Author normal `Schema<'T>` values first when you control the contract.
- Export JSON Schema from those authored contracts when other systems need a formal schema document.
- Import external JSON Schema into `Schema<JsonValue>` when you are receiving a dynamic or externally-owned contract.
That keeps the normal authored path simple while still giving you an integration story for external schema-driven systems.
For exact JSON Schema capabilities and fallback boundaries, see JSON Schema support reference and How to export JSON Schema.
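As a sketch of the two projections: only the names `JsonSchema.generate` and `JsonSchema.import` come from the diagram above; the `Order` type, the bindings, and the exact signatures are assumptions here.

```fsharp
open CodecMapper
open CodecMapper.Schema

type Order = { Id: int; Total: decimal }
let makeOrder id total = { Id = id; Total = total }

let orderSchema =
    define<Order>
    |> construct makeOrder
    |> field "id" _.Id
    |> field "total" _.Total
    |> build

// Outward: project the authored contract to a formal JSON Schema
// document for systems that consume schemas rather than payloads.
let schemaDocument = JsonSchema.generate orderSchema

// Inward: an externally owned contract would come back as a dynamic
// Schema<JsonValue>, not a typed schema:
// let received : Schema<JsonValue> = JsonSchema.import externalDocument
```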
## When models evolve
One of the main benefits over convention-based serializers is that model evolution becomes explicit.
If your domain gets richer but the wire contract does not need to change yet, keep the same wire shape and refine it:
```fsharp
open CodecMapper.Schema

type UserId = UserId of int

module UserId =
    let create value =
        if value > 0 then Ok(UserId value)
        else Error "UserId must be positive"

    let value (UserId value) = value

type Account = { Id: UserId; Name: string }
let makeAccount id name = { Id = id; Name = name }

let userIdSchema =
    int
    |> tryMap UserId.create UserId.value

let accountSchema =
    define<Account>
    |> construct makeAccount
    |> fieldWith "id" _.Id userIdSchema
    |> field "name" _.Name
    |> build
```
The JSON contract is still:
```json
{"id":42,"name":"Ada"}
```
The in-memory model is stronger, but you did not need a second DTO type just to keep that contract stable.
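A quick usage sketch for that refined schema; the codec and value bindings are hypothetical, and `UserId 42` is constructed directly only because the example value is known to be valid:

```fsharp
// Compile the accountSchema defined above into a reusable JSON codec.
let accountCodec = Json.compile accountSchema

let wire = Json.serialize accountCodec { Id = UserId 42; Name = "Ada" }
// expected wire shape: {"id":42,"name":"Ada"}

// Decoding runs UserId.create, so invalid ids fail instead of
// silently producing a bad domain value.
let roundTripped = Json.deserialize accountCodec wire
```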
If the wire contract really changes, the schema changes with it in one obvious place:
```fsharp
open CodecMapper.Schema

type PersonV2 = { Id: int; Name: string; Email: string option }
let makePersonV2 id name email = { Id = id; Name = name; Email = email }

let personV2Schema =
    define<PersonV2>
    |> construct makePersonV2
    |> field "id" _.Id
    |> field "name" _.Name
    |> field "email" _.Email
    |> build
```
That does not silently "pick up" the new field just because the record changed. You add it deliberately to the schema, so the contract review point is explicit.
Compared with DTO-heavy designs, the difference is:
- You still get an explicit wire contract.
- You do not automatically pay for duplicate transport types and mapping code.
- When you really do need a separate transport model, you can still introduce one on purpose instead of by default.
## What it covers
- Core schema DSL for explicit record, collection, option, and wrapper contracts in F#
- Reusable JSON and XML codecs compiled from the same schema
- Flat key/value projection for config and environment-style contracts
- A small YAML codec for config-style mappings, sequences, and scalars
- A thin C# facade for setter-bound schema authoring and codec compilation
- A handwritten parser/runtime in the core library rather than a thin wrapper over `System.Text.Json`
- Built-in support for common numeric, enum, string, boolean, GUID, time-based, and collection interop types
- Explicit field-policy helpers such as `Schema.missingAsNone`, `Schema.missingAsValue`, `Schema.nullAsValue`, `Schema.emptyCollectionAsValue`, and `Schema.emptyStringAsNone`
- Domain refinement through `Schema.map` and `Schema.tryMap`
- JSON Schema export from authored `Schema<'T>` contracts
- JSON Schema import into `Schema<JsonValue>` for external dynamic receive-side contracts
- Raw JSON fallback via `Schema.jsonValue` for shapes that do not lower cleanly into the normal schema subset
- .NET-only bridge importers for `System.Text.Json`, `Newtonsoft.Json`, and `DataContract`
## Compatibility
- Shared compatibility coverage lives in `tests/CodecMapper.CompatibilitySentinel`, with thin Native AOT and Fable shell apps under `tests/CodecMapper.AotTests` and `tests/CodecMapper.FableTests`.
- CI runs both the in-repo Fable sentinel and a packaged-consumer Fable transpilation check against the locally packed `CodecMapper` NuGet.
- The shared sentinel now includes selected invalid and out-of-range numeric cases, so the portability story covers failure behavior as well as happy-path round-trips.
## Performance Work

When benchmark numbers move, profile before changing the runtime. The repo now includes a repeatable perf workflow for the manual benchmark runner in `docs/HOW_TO_PROFILE_BENCHMARK_HOT_PATHS.md`.

- The contract bridge in `src/CodecMapper.Bridge` is .NET-only by design; the portable surface is the core schema/JSON/XML library in `src/CodecMapper`.
## Docs
- Start with Getting started.
- Use the contract pattern index when you need a quick jump page.
- Copy from How to model a basic record, how to model a nested record, how to model a validated wrapper, or how to model a versioned contract.
- Use Configuration contracts guide for versioned config shapes.
- Use How to export JSON Schema and JSON Schema support reference for schema interchange.
- Use How to import existing C# contracts for the bridge/facade story.
- Browse the API docs.
## Benchmarks
CodecMapper still carries benchmark coverage and comparison runners. The published numbers should be read as machine-specific snapshots, not universal claims.
For quick local comparisons, use the manual Release runner:
```shell
dotnet run -c Release --project benchmarks/CodecMapper.Benchmarks.Runner/CodecMapper.Benchmarks.Runner.fsproj
```
For BenchmarkDotNet output, use:
```shell
dotnet run -c Release --project benchmarks/CodecMapper.Benchmarks/CodecMapper.Benchmarks.fsproj
```
The benchmark suite compares CodecMapper JSON encode/decode against System.Text.Json and Newtonsoft.Json across a deterministic scenario matrix that covers small messages, nested-record batches, string-heavy payloads, numeric-heavy telemetry, and decode paths with ignored unknown fields.
Latest local manual scenario-matrix snapshot, measured on March 11, 2026.
The manual runner now covers six deterministic workloads:
- `small-message`: one shallow command-sized object
- `person-batch-25`: medium nested-record API-style batch
- `person-batch-250`: larger nested-record throughput batch
- `escaped-articles-20`: string-heavy records with escapes and nested authors
- `telemetry-500`: numeric-heavy objects with float, decimal, and wider integers
- `person-batch-25-unknown-fields`: receive-side decode with ignored extra fields
Headline observations from the latest local run:
- The latest optimization pass moved `CodecMapper` ahead on `small-message` serialize (1.87 us vs 2.25 us) while keeping tiny-message decode in the same general range.
- `CodecMapper` stayed effectively even with `System.Text.Json` on `person-batch-25` deserialize (94.6 us vs 94.7 us) and remained competitive on `person-batch-250` serialize (390.8 us vs 370.8 us).
- `System.Text.Json` still leads on the string-heavy `escaped-articles-20` workload, especially on deserialize.
- `System.Text.Json` also still leads the largest numeric-heavy `telemetry-500` case, which means the JSON runtime still has meaningful throughput and allocation work left on wide numeric batches.
- The unknown-field decode path improved, but `System.Text.Json` still holds a modest lead on `person-batch-25-unknown-fields` deserialize (125.8 us vs 132.5 us).
- Both `CodecMapper` and `System.Text.Json` stayed well ahead of `Newtonsoft.Json` across every workload in this local matrix.
These numbers came from:
```shell
dotnet run -c Release --project benchmarks/CodecMapper.Benchmarks.Runner/CodecMapper.Benchmarks.Runner.fsproj
```
## Notes

- `Json.compile` is explicit by design. Compile once and reuse the resulting codec.
- The current benchmark numbers are machine-specific and published mainly as relative comparisons, not universal claims.
| Product | Compatible and additional computed target framework versions |
|---|---|
| .NET | net10.0 is compatible. net10.0-android was computed. net10.0-browser was computed. net10.0-ios was computed. net10.0-maccatalyst was computed. net10.0-macos was computed. net10.0-tvos was computed. net10.0-windows was computed. |
Dependencies (net10.0):

- FSharp.Core (>= 10.0.103)
| Version | Downloads | Last Updated |
|---|---|---|
| 0.1.0 | 29 | 3/11/2026 |
First public preview release of CodecMapper with schema DSL, shared JSON/XML codecs, YAML and key/value projections, JSON Schema import/export, and the .NET bridge package in the repo.