ODataToSql 2.0.13
ODataToSql
High-performance .NET library that converts OData V4 queries into optimized SQL statements for multiple database dialects including SQL Server, PostgreSQL, MySQL, Athena, Redshift, and Snowflake.
Installation
dotnet add package ODataToSql
Or via NuGet Package Manager:
Install-Package ODataToSql
Quick Start
Unified API (Recommended)
The ODataToSqlConverter.ConvertODataToSql method provides a simplified, unified API for converting OData queries to SQL:
using ODataToSql;
using ODataToSql.Models;
// Define your dataset metadata
var datasets = new List<DatasetMetadataDto>
{
new DatasetMetadataDto
{
Name = "Customers",
DatasetSchema = new DatasetSchemaDto
{
PrimaryKey = "CustomerId",
Columns = new List<ColumnDto>
{
new() { Name = "CustomerId", DataType = "int32", Nullable = false },
new() { Name = "Name", DataType = "string", Nullable = false },
new() { Name = "Email", DataType = "string", Nullable = false },
new() { Name = "Age", DataType = "int32", Nullable = false }
}
},
Locations = new List<LocationDto>
{
new() { Schema = "dbo", Table = "Customers" }
}
}
};
// Convert OData query to SQL
var result = ODataToSqlConverter.ConvertODataToSql(
datasetsMetadata: datasets,
oDataQuery: "$select=Name,Email&$filter=Age gt 18&$orderby=Name asc&$top=10",
dialect: "SqlServer",
useTableAliasName: true
);
// Use the generated SQL
Console.WriteLine(result.SqlStatement);
// Output: SELECT t0.Name, t0.Email FROM dbo.Customers AS t0
// WHERE t0.Age > @p0 ORDER BY t0.Name ASC
// OFFSET 0 ROWS FETCH NEXT 10 ROWS ONLY
// Access parameters for safe execution
foreach (var param in result.Parameters)
{
Console.WriteLine($"{param.Key} = {param.Value}");
}
// Output: @p0 = 18
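The parameters must always be passed to your database driver rather than concatenated into the SQL string. For logging or debugging, however, it can be handy to see the statement with values inlined. The sketch below is an illustrative helper (not part of the library); `InlineForLogging` is a hypothetical name, and it assumes the parameter dictionary maps placeholder keys to values as shown above:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Debug-only helper: inline parameter values into a SQL string for logging.
// Never execute the inlined string -- always bind Parameters via your DB driver.
static string InlineForLogging(string sql, IDictionary<string, object> parameters)
{
    // Replace longer keys first so @p10 is not clobbered by @p1.
    foreach (var kv in parameters.OrderByDescending(p => p.Key.Length))
    {
        var value = kv.Value is string s
            ? $"'{s.Replace("'", "''")}'"   // naive quoting, for display only
            : kv.Value?.ToString() ?? "NULL";
        sql = sql.Replace(kv.Key, value);
    }
    return sql;
}

var debugSql = InlineForLogging(
    "SELECT Name FROM Customers WHERE Age > @p0",
    new Dictionary<string, object> { ["@p0"] = 18 });
Console.WriteLine(debugSql);
// SELECT Name FROM Customers WHERE Age > 18
```

This is strictly a diagnostics aid; executing the inlined string would defeat the library's parameterization.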
Alternative: Direct EntityMetadata API
For scenarios where you have pre-built EntityMetadata:
using ODataToSql;
using ODataToSql.Core;
// Define entity metadata directly
var metadata = new EntityMetadata
{
Name = "Customers",
TableName = "Customers",
SchemaName = "dbo",
PrimaryKey = "CustomerId",
Columns = new Dictionary<string, ColumnMetadata>
{
["CustomerId"] = new ColumnMetadata { Name = "CustomerId", DataType = typeof(int), IsNullable = false },
["Name"] = new ColumnMetadata { Name = "Name", DataType = typeof(string), IsNullable = false },
["Age"] = new ColumnMetadata { Name = "Age", DataType = typeof(int), IsNullable = false }
}
};
// Convert using EntityMetadata
var result = ODataToSqlConverter.ConvertODataToSql(
entityMetadata: metadata,
oDataQuery: "$select=Name,Age&$filter=Age gt 18&$orderby=Name asc",
dialect: "SqlServer",
useTableAliasName: true
);
SQL Generator Options
The library provides two SQL generation engines:
Default SQL Generator (Recommended)
var result = ODataToSqlConverter.ConvertODataToSql(
metadata,
"$select=Name,Age&$filter=Age gt 18",
"SqlServer",
useTableAliasName: true,
sqlCompiler: "sql" // Default - recommended for production
);
SqlKata Query Builder (Alternative)
Uses SqlKata for query building abstraction:
var result = ODataToSqlConverter.ConvertODataToSql(
metadata,
"$select=Name,Age&$filter=Age gt 18",
"SqlServer",
useTableAliasName: true,
sqlCompiler: "sqlkata" // Alternative implementation
);
When to use:
- Default SQL Generator: Production environments, complex queries, best performance (recommended)
- SqlKata Generator: Prototyping, simple queries, query builder abstraction preference
Features
- OData V4 Query Support: Full support for $select, $filter, $orderby, $top, $skip, $expand, $apply, and $compute
- Multiple SQL Dialects: SQL Server, PostgreSQL, MySQL, SQLite, ANSI SQL, Athena, Redshift, and Snowflake
- Parameterized Queries: Automatic SQL injection protection through parameterization with dialect-specific formats
- Multi-Level Expands: Handle complex nested entity relationships with unlimited depth
- Aggregate Transformations: Support for $apply with groupby, aggregate, filter, and compute operations
- Intelligent Filter Placement: Automatic WHERE vs HAVING clause detection with dialect-specific optimizations:
  - Athena: Expands aggregate and computed expressions inline in the HAVING clause for full compatibility
  - Other Dialects: Use cleaner column aliases in the HAVING clause for better readability
- Computed Column References: Supports computed columns that reference other computed columns with recursive expansion
- Type Safety: Strong typing with compile-time validation
- High Performance: Optimized parsing and flat SQL generation with JOINs instead of subqueries
- Rich Function Library: Complete support for string, date/time, and math functions per the OData V4 spec
- Date/Time Literals: ISO 8601 date, datetime, and datetimeoffset literals with timezone support
- Comprehensive Error Handling: Detailed validation errors with helpful messages
SQL Dialect Examples
The library supports 8 SQL dialects with dialect-specific parameter formatting and syntax:
SQL Server
var result = ODataToSqlConverter.ConvertODataToSql(
datasets,
"$select=Name,Age&$filter=Age gt 18&$top=10",
"SqlServer",
useTableAliasName: false
);
// Generated SQL:
// SELECT Name, Age FROM Customers
// WHERE Age > @p0
// ORDER BY Age
// OFFSET 0 ROWS FETCH NEXT 10 ROWS ONLY
// Parameters: { "@p0": 18 }
PostgreSQL
var result = ODataToSqlConverter.ConvertODataToSql(
datasets,
"$select=Name,Age&$filter=Age gt 18&$top=10",
"PostgreSql",
useTableAliasName: false
);
// Generated SQL:
// SELECT Name, Age FROM Customers
// WHERE Age > $1
// LIMIT 10
// Parameters: { "$1": 18 }
MySQL
var result = ODataToSqlConverter.ConvertODataToSql(
datasets,
"$select=Name,Age&$filter=Age gt 18&$top=10",
"MySql",
useTableAliasName: false
);
// Generated SQL:
// SELECT Name, Age FROM Customers
// WHERE Age > ?
// LIMIT 10
// Parameters: { "p0": 18 }
Amazon Athena
var result = ODataToSqlConverter.ConvertODataToSql(
datasets,
"$select=Name,Age&$filter=Age gt 18&$top=10&$skip=5",
"Athena",
useTableAliasName: false
);
// Generated SQL:
// SELECT Name, Age FROM Customers
// WHERE Age > :p0
// OFFSET 5 LIMIT 10
// Parameters: { ":p0": 18 }
// Note: Athena requires OFFSET before LIMIT (the reverse of the LIMIT ... OFFSET order used by most dialects)
// Note: Athena expands aggregate expressions inline in HAVING clauses for compatibility
Amazon Redshift
var result = ODataToSqlConverter.ConvertODataToSql(
datasets,
"$select=Name,Age&$filter=Age gt 18&$top=10",
"Redshift",
useTableAliasName: false
);
// Generated SQL:
// SELECT Name, Age FROM Customers
// WHERE Age > :p0
// LIMIT 10
// Parameters: { ":p0": 18 }
Snowflake
var result = ODataToSqlConverter.ConvertODataToSql(
datasets,
"$select=Name,Age&$filter=Age gt 18&$top=10",
"Snowflake",
useTableAliasName: false
);
// Generated SQL:
// SELECT Name, Age FROM Customers
// WHERE Age > :p0
// LIMIT 10
// Parameters: { ":p0": 18 }
Parameter Format by Dialect
| Dialect | SQL Placeholder | Dictionary Key | Example |
|---|---|---|---|
| SQL Server | @p0, @p1 | @p0, @p1 | WHERE Age > @p0 |
| PostgreSQL | $1, $2 | $1, $2 | WHERE Age > $1 |
| MySQL | ? | p0, p1 | WHERE Age > ? |
| SQLite | ? | p0, p1 | WHERE Age > ? |
| Athena | :p0, :p1 | :p0, :p1 | WHERE Age > :p0 |
| Redshift | :p0, :p1 | :p0, :p1 | WHERE Age > :p0 |
| Snowflake | :p0, :p1 | :p0, :p1 | WHERE Age > :p0 |
| AnsiSQL | ? | p0, p1 | WHERE Age > ? |
Notes:
- MySQL, SQLite, and ANSI SQL use ? placeholders with positional binding (dictionary keys p0, p1, etc.)
- Athena, Redshift, and Snowflake use :p0, :p1 named parameters
- SQL Server uses @p0, @p1 named parameters
- PostgreSQL uses $1, $2 positional parameters (1-indexed)
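For the ?-placeholder dialects, the dictionary keys (p0, p1, ...) encode the positional order in which values must be bound. A sketch of that ordering step, assuming keys follow the p-prefixed numbering shown in the table (`ToPositional` is an illustrative helper, not a library method):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Order a named-parameter dictionary by the numeric suffix of its keys so the
// values can be bound positionally (MySQL, SQLite, ANSI SQL ? placeholders).
static object[] ToPositional(IDictionary<string, object> parameters) =>
    parameters
        .OrderBy(p => int.Parse(p.Key.TrimStart(':', '@', '$', 'p')))
        .Select(p => p.Value)
        .ToArray();

var ordered = ToPositional(new Dictionary<string, object>
{
    ["p1"] = "John", ["p0"] = 18, ["p10"] = true   // deliberately out of order
});
Console.WriteLine(string.Join(", ", ordered)); // 18, John, True
```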
Advanced Features
WHERE vs HAVING Clause (Intelligent Filter Placement)
The library automatically determines whether filters should go in the WHERE clause (for regular columns) or HAVING clause (for aggregate/computed columns), with dialect-specific optimizations:
Basic Aggregate Filter
OData Query:
$apply=groupby((Country), aggregate(People with sum as Total))&$filter=Total gt 100
Athena (expands expressions inline):
SELECT "Country", SUM("People") AS "Total"
FROM "Employees"
GROUP BY "Country"
HAVING SUM("People") > :p0
Redshift/Snowflake (uses column aliases):
SELECT "Country", SUM("People") AS "Total"
FROM "Employees"
GROUP BY "Country"
HAVING "Total" > :p0
SQL Server (uses column aliases):
SELECT [Country], SUM([People]) AS [Total]
FROM [Employees]
GROUP BY [Country]
HAVING [Total] > @p0
Computed Column Filter
OData Query:
$apply=groupby((Country), aggregate(People with sum as Total))/compute(Total add 1 as Total1)&$filter=Total1 gt 100
Athena:
SELECT "Country", SUM("People") AS "Total", (SUM("People") + :p0) AS "Total1"
FROM "Employees"
GROUP BY "Country"
HAVING ((SUM("People") + :p1) > :p2)
SQL Server:
SELECT [Country], SUM([People]) AS [Total], ([Total] + @p0) AS [Total1]
FROM [Employees]
GROUP BY [Country]
HAVING [Total1] > @p1
Mixed Filters (WHERE + HAVING)
OData Query:
$apply=groupby((Country, City), aggregate(Salary with sum as TotalSalary))&$filter=Country eq 'USA' and TotalSalary gt 100000
SQL Server:
SELECT [Country], [City], SUM([Salary]) AS [TotalSalary]
FROM [Employees]
WHERE [Country] = @p0
GROUP BY [Country], [City]
HAVING [TotalSalary] > @p1
Athena:
SELECT "Country", "City", SUM("Salary") AS "TotalSalary"
FROM "Employees"
WHERE "Country" = :p0
GROUP BY "Country", "City"
HAVING SUM("Salary") > :p1
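The placement rule behind these examples can be sketched as: a predicate that references an aggregate or computed alias belongs in HAVING, everything else in WHERE. The library applies this on its parsed expression tree; the string-based version below is a simplified illustration (`SplitFilters` is a hypothetical name, and substring matching stands in for proper token analysis):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Simplified sketch of WHERE-vs-HAVING placement: predicates referencing an
// aggregate/computed alias go to HAVING, the rest to WHERE. A real
// implementation checks tokens on the expression tree, not substrings.
static (List<string> Where, List<string> Having) SplitFilters(
    IEnumerable<string> predicates, ISet<string> aggregateAliases)
{
    var where = new List<string>();
    var having = new List<string>();
    foreach (var p in predicates)
        (aggregateAliases.Any(a => p.Contains(a)) ? having : where).Add(p);
    return (where, having);
}

var (w, h) = SplitFilters(
    new[] { "Country = 'USA'", "TotalSalary > 100000" },
    new HashSet<string> { "TotalSalary" });
Console.WriteLine($"WHERE: {string.Join(" AND ", w)}");  // WHERE: Country = 'USA'
Console.WriteLine($"HAVING: {string.Join(" AND ", h)}"); // HAVING: TotalSalary > 100000
```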
Multi-Level Relationships with $expand
var datasets = new List<DatasetMetadataDto>
{
new DatasetMetadataDto
{
Name = "Customers",
DatasetSchema = new DatasetSchemaDto
{
PrimaryKey = "CustomerId",
Columns = new List<ColumnDto>
{
new() { Name = "CustomerId", DataType = "int32", Nullable = false },
new() { Name = "CustomerName", DataType = "string", Nullable = false }
}
}
},
new DatasetMetadataDto
{
Name = "Orders",
DatasetSchema = new DatasetSchemaDto
{
PrimaryKey = "OrderId",
Columns = new List<ColumnDto>
{
new() { Name = "OrderId", DataType = "int32", Nullable = false },
new() { Name = "CustomerId", DataType = "int32", Nullable = false },
new() { Name = "OrderTotal", DataType = "decimal", Nullable = false }
}
}
}
};
var result = ODataToSqlConverter.ConvertODataToSql(
datasets,
"$select=CustomerName&$expand=Orders($select=OrderTotal;$filter=OrderTotal gt 100)",
"SqlServer",
useTableAliasName: true
);
// Generated SQL:
// SELECT t0.CustomerName, t1.OrderTotal
// FROM Customers AS t0
// LEFT JOIN Orders AS t1 ON t0.CustomerId = t1.CustomerId
// WHERE t1.OrderTotal > @p0
Date/Time Functions
var result = ODataToSqlConverter.ConvertODataToSql(
datasets,
"$filter=year(BirthDate) eq 2000 and month(BirthDate) eq 1",
"SqlServer",
useTableAliasName: false
);
// Generated SQL:
// SELECT * FROM Customers
// WHERE (YEAR(BirthDate) = @p0) AND (MONTH(BirthDate) = @p1)
// Parameters: { "@p0": 2000, "@p1": 1 }
String Functions
var result = ODataToSqlConverter.ConvertODataToSql(
datasets,
"$filter=contains(Name, 'John') and length(Email) gt 10",
"PostgreSql",
useTableAliasName: false
);
// Generated SQL:
// SELECT * FROM Customers
// WHERE (Name LIKE '%' || $1 || '%') AND (LENGTH(Email) > $2)
// Parameters: { "$1": "John", "$2": 10 }
Math Functions
var result = ODataToSqlConverter.ConvertODataToSql(
datasets,
"$filter=round(Price) gt 100 and ceiling(Discount) lt 20",
"MySql",
useTableAliasName: false
);
// Generated SQL:
// SELECT * FROM Products
// WHERE (ROUND(Price, 0) > ?) AND (CEILING(Discount) < ?)
// Parameters: { "p0": 100, "p1": 20 }
Computed Columns with References
The library supports computed columns that reference other computed columns:
var result = ODataToSqlConverter.ConvertODataToSql(
datasets,
"$compute=total_site_acres add 1 as c1, c1 sub 20 as c2&$select=c1,c2",
"Athena",
useTableAliasName: true
);
// Generated SQL (Athena - expands inline):
// SELECT (t0."total_site_acres" + :p0) AS "c1",
// ((t0."total_site_acres" + :p0) - :p1) AS "c2"
// FROM "DATA_CAMPUS" AS t0
// Parameters: { ":p0": 1, ":p1": 20 }
Key Features:
- Computed columns can reference other computed columns
- Expressions are recursively expanded inline
- Consistent behavior across all SQL dialects
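The recursive expansion described above can be pictured as repeated substitution until only base columns remain. The sketch below illustrates the idea with naive string replacement (`Expand` is a hypothetical helper; the library works on parsed expressions with proper token boundaries, so aliases that are substrings of other names are not a problem there):

```csharp
using System;
using System.Collections.Generic;

// Recursively expand computed-column aliases until only base columns remain.
// Illustrative only: naive string Replace has no token boundaries.
static string Expand(string expr, IDictionary<string, string> computed)
{
    bool changed = true;
    while (changed)
    {
        changed = false;
        foreach (var kv in computed)
        {
            if (expr.Contains(kv.Key))
            {
                expr = expr.Replace(kv.Key, "(" + kv.Value + ")");
                changed = true;
            }
        }
    }
    return expr;
}

var computed = new Dictionary<string, string>
{
    ["c1"] = "total_site_acres + 1",  // c1 references a base column
    ["c2"] = "c1 - 20",               // c2 references another computed column
};
Console.WriteLine(Expand("c2", computed));
// ((total_site_acres + 1) - 20)
```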
Complete Usage Examples
Example 1: Simple Filter and Sort (SQL Server)
using ODataToSql;
using ODataToSql.Models;
var datasets = new List<DatasetMetadataDto>
{
new DatasetMetadataDto
{
Name = "Products",
DatasetSchema = new DatasetSchemaDto
{
PrimaryKey = "ProductId",
Columns = new List<ColumnDto>
{
new() { Name = "ProductId", DataType = "int32", Nullable = false },
new() { Name = "ProductName", DataType = "string", Nullable = false },
new() { Name = "Price", DataType = "decimal", Nullable = false },
new() { Name = "InStock", DataType = "boolean", Nullable = false }
}
},
Locations = new List<LocationDto>
{
new() { Schema = "inventory", Table = "Products" }
}
}
};
var result = ODataToSqlConverter.ConvertODataToSql(
datasets,
"$select=ProductName,Price&$filter=InStock eq true&$orderby=Price desc&$top=10",
"SqlServer",
useTableAliasName: false
);
Console.WriteLine(result.SqlStatement);
// Output:
// SELECT ProductName, Price
// FROM inventory.Products
// WHERE InStock = @p0
// ORDER BY Price DESC
// OFFSET 0 ROWS FETCH NEXT 10 ROWS ONLY
Console.WriteLine($"Parameters: {string.Join(", ", result.Parameters.Select(p => $"{p.Key}={p.Value}"))}");
// Output: Parameters: @p0=True
Example 2: Aggregation with GroupBy (PostgreSQL)
var datasets = new List<DatasetMetadataDto>
{
new DatasetMetadataDto
{
Name = "Sales",
DatasetSchema = new DatasetSchemaDto
{
PrimaryKey = "SaleId",
Columns = new List<ColumnDto>
{
new() { Name = "SaleId", DataType = "int32", Nullable = false },
new() { Name = "Country", DataType = "string", Nullable = false },
new() { Name = "Revenue", DataType = "decimal", Nullable = false }
}
},
Locations = new List<LocationDto>
{
new() { Schema = "public", Table = "Sales" }
}
}
};
var result = ODataToSqlConverter.ConvertODataToSql(
datasets,
"$apply=groupby((Country), aggregate(Revenue with sum as TotalRevenue))&$filter=TotalRevenue gt 10000",
"PostgreSql",
useTableAliasName: false
);
Console.WriteLine(result.SqlStatement);
// Output:
// SELECT Country, SUM(Revenue) AS TotalRevenue
// FROM public.Sales
// GROUP BY Country
// HAVING TotalRevenue > $1
Console.WriteLine($"Parameters: {string.Join(", ", result.Parameters.Select(p => $"{p.Key}={p.Value}"))}");
// Output: Parameters: $1=10000
Example 3: Multi-Level Expand (Athena)
var datasets = new List<DatasetMetadataDto>
{
new DatasetMetadataDto
{
Name = "Customers",
DatasetSchema = new DatasetSchemaDto
{
PrimaryKey = "CustomerId",
Columns = new List<ColumnDto>
{
new() { Name = "CustomerId", DataType = "int32", Nullable = false },
new() { Name = "CustomerName", DataType = "string", Nullable = false }
}
},
Locations = new List<LocationDto>
{
new() { Schema = "sales", Table = "customers" }
}
},
new DatasetMetadataDto
{
Name = "Orders",
DatasetSchema = new DatasetSchemaDto
{
PrimaryKey = "OrderId",
Columns = new List<ColumnDto>
{
new() { Name = "OrderId", DataType = "int32", Nullable = false },
new() { Name = "CustomerId", DataType = "int32", Nullable = false },
new() { Name = "OrderDate", DataType = "datetime", Nullable = false },
new() { Name = "Total", DataType = "decimal", Nullable = false }
}
},
Locations = new List<LocationDto>
{
new() { Schema = "sales", Table = "orders" }
}
},
new DatasetMetadataDto
{
Name = "OrderItems",
DatasetSchema = new DatasetSchemaDto
{
PrimaryKey = "OrderItemId",
Columns = new List<ColumnDto>
{
new() { Name = "OrderItemId", DataType = "int32", Nullable = false },
new() { Name = "OrderId", DataType = "int32", Nullable = false },
new() { Name = "ProductName", DataType = "string", Nullable = false },
new() { Name = "Quantity", DataType = "int32", Nullable = false }
}
},
Locations = new List<LocationDto>
{
new() { Schema = "sales", Table = "order_items" }
}
}
};
var result = ODataToSqlConverter.ConvertODataToSql(
datasets,
"$select=CustomerName&$expand=Orders($select=OrderDate,Total;$expand=OrderItems($select=ProductName,Quantity))",
"Athena",
useTableAliasName: true
);
Console.WriteLine(result.SqlStatement);
// Output:
// SELECT t0."CustomerName", t1."OrderDate", t1."Total", t2."ProductName", t2."Quantity"
// FROM sales.customers AS t0
// LEFT JOIN sales.orders AS t1 ON t0."CustomerId" = t1."CustomerId"
// LEFT JOIN sales.order_items AS t2 ON t1."OrderId" = t2."OrderId"
Example 4: Computed Columns with Aggregates (Redshift)
var datasets = new List<DatasetMetadataDto>
{
new DatasetMetadataDto
{
Name = "Employees",
DatasetSchema = new DatasetSchemaDto
{
PrimaryKey = "EmployeeId",
Columns = new List<ColumnDto>
{
new() { Name = "EmployeeId", DataType = "int32", Nullable = false },
new() { Name = "Department", DataType = "string", Nullable = false },
new() { Name = "Salary", DataType = "decimal", Nullable = false }
}
},
Locations = new List<LocationDto>
{
new() { Schema = "hr", Table = "employees" }
}
}
};
var result = ODataToSqlConverter.ConvertODataToSql(
datasets,
"$apply=groupby((Department), aggregate(Salary with sum as TotalSalary, Salary with avg as AvgSalary))/compute(TotalSalary div AvgSalary as Ratio)&$filter=Ratio gt 10",
"Redshift",
useTableAliasName: false
);
Console.WriteLine(result.SqlStatement);
// Output:
// SELECT Department, SUM(Salary) AS TotalSalary, AVG(Salary) AS AvgSalary, (TotalSalary / AvgSalary) AS Ratio
// FROM hr.employees
// GROUP BY Department
// HAVING Ratio > :p0
Console.WriteLine($"Parameters: {string.Join(", ", result.Parameters.Select(p => $"{p.Key}={p.Value}"))}");
// Output: Parameters: :p0=10
Example 5: Date/Time Filtering (Snowflake)
var datasets = new List<DatasetMetadataDto>
{
new DatasetMetadataDto
{
Name = "Events",
DatasetSchema = new DatasetSchemaDto
{
PrimaryKey = "EventId",
Columns = new List<ColumnDto>
{
new() { Name = "EventId", DataType = "int32", Nullable = false },
new() { Name = "EventName", DataType = "string", Nullable = false },
new() { Name = "EventDate", DataType = "datetime", Nullable = false },
new() { Name = "CreatedAt", DataType = "datetimeoffset", Nullable = false }
}
},
Locations = new List<LocationDto>
{
new() { Schema = "public", Table = "events" }
}
}
};
var result = ODataToSqlConverter.ConvertODataToSql(
datasets,
"$filter=year(EventDate) eq 2024 and CreatedAt gt 2024-01-01T00:00:00Z&$orderby=EventDate desc",
"Snowflake",
useTableAliasName: false
);
Console.WriteLine(result.SqlStatement);
// Output:
// SELECT *
// FROM public.events
// WHERE (YEAR(EventDate) = :p0) AND (CreatedAt > '2024-01-01 00:00:00'::timestamp)
// ORDER BY EventDate DESC
Console.WriteLine($"Parameters: {string.Join(", ", result.Parameters.Select(p => $"{p.Key}={p.Value}"))}");
// Output: Parameters: :p0=2024
Example 6: String Functions (MySQL)
var datasets = new List<DatasetMetadataDto>
{
new DatasetMetadataDto
{
Name = "Users",
DatasetSchema = new DatasetSchemaDto
{
PrimaryKey = "UserId",
Columns = new List<ColumnDto>
{
new() { Name = "UserId", DataType = "int32", Nullable = false },
new() { Name = "FirstName", DataType = "string", Nullable = false },
new() { Name = "LastName", DataType = "string", Nullable = false },
new() { Name = "Email", DataType = "string", Nullable = false }
}
},
Locations = new List<LocationDto>
{
new() { Schema = "app", Table = "users" }
}
}
};
var result = ODataToSqlConverter.ConvertODataToSql(
datasets,
"$filter=contains(Email, 'gmail.com') and length(FirstName) gt 3&$compute=concat(FirstName, LastName) as FullName&$select=FullName,Email",
"MySql",
useTableAliasName: false
);
Console.WriteLine(result.SqlStatement);
// Output:
// SELECT CONCAT(FirstName, LastName) AS FullName, Email
// FROM app.users
// WHERE (Email LIKE CONCAT('%', ?, '%')) AND (LENGTH(FirstName) > ?)
Console.WriteLine($"Parameters: {string.Join(", ", result.Parameters.Select(p => $"{p.Key}={p.Value}"))}");
// Output: Parameters: p0=gmail.com, p1=3
Error Handling
The unified API provides detailed error information for debugging:
try
{
var result = ODataToSqlConverter.ConvertODataToSql(
datasets,
"$filter=NonExistentColumn eq 'test'",
"SqlServer",
useTableAliasName: false
);
}
catch (ConversionFailedException ex)
{
Console.WriteLine($"Conversion failed at stage: {ex.Stage}");
Console.WriteLine($"Original query: {ex.ODataQuery}");
if (ex.ValidationErrors != null && ex.ValidationErrors.Any())
{
Console.WriteLine("Validation errors:");
foreach (var error in ex.ValidationErrors)
{
Console.WriteLine($" - {error.Message} (Location: {error.Location})");
}
}
if (ex.ErrorPosition.HasValue)
{
Console.WriteLine($"Error at position: {ex.ErrorPosition.Value}");
}
}
// Output:
// Conversion failed at stage: Validation
// Original query: $filter=NonExistentColumn eq 'test'
// Validation errors:
// - Column 'NonExistentColumn' does not exist in entity 'Products' (Location: $filter)
Supported OData Operations
| Operation | Description | Example |
|---|---|---|
| $filter | Filter results with complex expressions | $filter=Age gt 18 and contains(Name, 'John') |
| $select | Select specific fields | $select=Name,Email |
| $orderby | Sort results (asc/desc) | $orderby=Name desc,Age asc |
| $top | Limit results | $top=10 |
| $skip | Skip results for pagination | $skip=20 |
| $expand | Include related entities with nested options | $expand=Orders($select=Total;$filter=Total gt 100) |
| $apply | Aggregate transformations | $apply=groupby((Category), aggregate(Price with sum as Total)) |
| $compute | Calculated columns | $compute=Price mul Quantity as Total |
Supported OData V4 Functions
String Functions
contains, startswith, endswith, length, indexof, substring, tolower, toupper, trim, concat
Date/Time Functions
year, month, day, hour, minute, second, date, time, now, mindatetime, maxdatetime, totaloffsetminutes, fractionalseconds
Math Functions
round, floor, ceiling
Aggregate Functions
sum, average, min, max, count
Type Functions
cast
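Most of these OData function names translate directly to a SQL function, as the earlier dialect examples show (for instance, round(Price) becomes ROUND(Price, 0) in MySQL). A small illustrative mapping table, with standard SQL names on the right and dialect differences omitted (this dictionary is a sketch, not the library's internal table):

```csharp
using System;
using System.Collections.Generic;

// Illustrative mapping of OData V4 function names to common SQL equivalents.
// Real translation is dialect-aware (e.g., contains -> LIKE with wildcards).
var functionMap = new Dictionary<string, string>
{
    ["tolower"] = "LOWER",  ["toupper"] = "UPPER", ["length"]  = "LENGTH",
    ["trim"]    = "TRIM",   ["concat"]  = "CONCAT",
    ["year"]    = "YEAR",   ["month"]   = "MONTH", ["day"]     = "DAY",
    ["round"]   = "ROUND",  ["floor"]   = "FLOOR", ["ceiling"] = "CEILING",
};

Console.WriteLine(functionMap["ceiling"]); // CEILING
```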
Supported SQL Dialects
- SQL Server - Full T-SQL support with OFFSET/FETCH pagination
- PostgreSQL - Complete PostgreSQL syntax with $1, $2 parameters
- MySQL - MySQL/MariaDB with ? placeholders
- SQLite - SQLite syntax with ? placeholders
- Amazon Athena - Presto/Trino-based with OFFSET...LIMIT order
- Amazon Redshift - Redshift-specific optimizations
- Snowflake - Snowflake syntax and functions
- ANSI SQL - Standard SQL for maximum compatibility
Migration Guide
Upgrading from Previous Versions
If you're using the direct QueryConverter API, no changes are required. The library maintains full backward compatibility.
To take advantage of new orchestration features:
Before (still works):
var converter = new QueryConverter(SqlDialect.PostgreSql);
var result = converter.Convert(odataQuery, metadata);
After (recommended for production):
var context = new ConversionContext
{
ODataQuery = odataQuery,
EntityName = "MyEntity",
Metadata = metadata,
Dialect = "postgresql"
};
var result = await QueryConverterExtensions.ConvertAsync(context);
if (result.IsSuccess)
{
var sqlResult = (SqlResult)result.Data;
// Use sqlResult.SqlStatement and sqlResult.Parameters
}
Benefits of Orchestration API
- Error Handling: Errors are returned as structured data instead of exceptions
- Validation: Built-in request validation before conversion
- Logging: Optional logging callbacks for diagnostics
- Flexibility: Support for dynamic metadata fetching and data execution
- Consistency: Uniform error format across all failure scenarios
Requirements
- .NET 10.0 or higher
Documentation
- Full Documentation: GitHub Repository
- API Reference: GitHub Wiki
- Examples: Sample Projects
Support
- Issues: Report bugs or request features
- Discussions: Ask questions
License
This project is licensed under the MIT License.
Contributing
Contributions are welcome! Please see our Contributing Guide for details.
| Product | Compatible and additional computed target framework versions |
|---|---|
| .NET | net10.0 is compatible. net10.0-android was computed. net10.0-browser was computed. net10.0-ios was computed. net10.0-maccatalyst was computed. net10.0-macos was computed. net10.0-tvos was computed. net10.0-windows was computed. |
Dependencies (net10.0):
- SqlKata (>= 2.4.0)
- SqlKata.Execution (>= 2.4.0)
| Version | Downloads | Last Updated | |
|---|---|---|---|
| 2.0.13 | 55 | 3/17/2026 | |
| 2.0.12 | 69 | 3/5/2026 | |
| 2.0.9 | 66 | 3/3/2026 | |
| 2.0.8 | 58 | 3/3/2026 | |
| 2.0.7 | 62 | 3/3/2026 | |
| 2.0.6 | 64 | 3/2/2026 | |
| 2.0.5 | 62 | 3/2/2026 | |
| 2.0.4 | 65 | 2/28/2026 | |
| 2.0.3 | 67 | 2/27/2026 | |
| 2.0.2 | 66 | 2/26/2026 | |
| 2.0.1 | 60 | 2/25/2026 | |
| 2.0.0 | 62 | 2/24/2026 | |
| 1.0.10 | 66 | 2/24/2026 | |
| 1.0.9 | 60 | 2/24/2026 | |
| 1.0.8 | 58 | 2/23/2026 | |
| 1.0.7 | 59 | 2/23/2026 | |
| 1.0.6 | 63 | 2/23/2026 | |
| 1.0.5 | 60 | 2/23/2026 |
See CHANGELOG.md for detailed release notes.