Analyzer that provides embedded SQL syntax analysis when writing queries using Npgsql.FSharp. It verifies query syntax, checks that the parameters used in the query match the provided parameters, and performs type-checking on the functions that read columns from the result sets.
| Package | Stable | Prerelease |
|---|---|---|
| NpgsqlFSharpAnalyzer | | |
The analyzer requires an environment variable named `NPGSQL_FSHARP` with a connection string that points to your development database. The analyzer uses this connection string to retrieve the schema of the database as well as the shape and column types of the result sets returned by the various queries.
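For example, on a unix shell the variable can be set like this (the host, database name, and credentials below are placeholders; substitute the values for your own development database):

```shell
# example connection string for a local development database;
# adjust Host/Database/Username/Password to match your setup
export NPGSQL_FSHARP="Host=localhost; Database=my_dev_db; Username=postgres; Password=postgres"
```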
Use paket to install the analyzer into a specialized Analyzers dependency group like this:
```
paket add NpgsqlFSharpAnalyzer --group Analyzers
```
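For reference, the resulting group in your `paket.dependencies` file might look roughly like this (the source URL shown is the default NuGet v3 feed; adjust it to your own setup):

```
group Analyzers
    source https://api.nuget.org/v3/index.json
    nuget NpgsqlFSharpAnalyzer
```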
DO NOT use `storage: none`, because the analyzer package needs to be downloaded physically into the `packages/analyzers` directory.
Make sure you have these settings in Ionide for F#:
```json
{
    "FSharp.enableAnalyzers": true,
    "FSharp.analyzersPath": [
        "./packages/analyzers"
    ]
}
```
This instructs Ionide to load the analyzers from the directory into which NpgsqlFSharpAnalyzer was installed.
When it is not convenient to write a query inline like this:
```fsharp
Sql.query "HERE COMES A VERY LONG QUERY"
```
you can define the query as a module-level `[<Literal>]` string and use it from the `Sql.query` function like this:
```fsharp
let [<Literal>] selectActiveUsers = """
    SELECT * FROM users
    WHERE is_active = @is_active
"""

let activeUsers (connectionString: string) =
    connectionString
    |> Sql.connect
    |> Sql.query selectActiveUsers
    |> Sql.parameters [ "is_active", Sql.Value true ]
    |> Sql.executeReader (Sql.readRow >> Sql.readString "username")
```
The `[<Literal>]` string has to be defined in the same module in order for the analysis to run properly.
Better yet, if you use the FSharp.Data.LiteralProviders package, you can write your SQL queries in external SQL files and load them at compile time as a `[<Literal>]` string so the analyzer can pick them up:
```fsharp
let [<Literal>] selectActiveUsers = TextFile<"selectActiveUsers.sql">.Text

let activeUsers (connectionString: string) =
    connectionString
    |> Sql.connect
    |> Sql.query selectActiveUsers
    |> Sql.parameters [ "is_active", Sql.Value true ]
    |> Sql.executeReader (Sql.readRow >> Sql.readString "username")
```
Just remember that these `[<Literal>]` strings have to be defined in the same module where the query is written.
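For reference, a `selectActiveUsers.sql` file matching the query above would simply contain the raw SQL:

```sql
-- selectActiveUsers.sql: loaded at compile time via TextFile<"selectActiveUsers.sql">.Text
SELECT * FROM users
WHERE is_active = @is_active
```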
Make sure the following requirements are installed on your system:
- dotnet SDK 3.0 or higher
- Mono if you're on Linux or macOS.
- Postgres database server
Then run either:
```
> build.cmd <optional buildtarget>   // on windows
$ ./build.sh <optional buildtarget>  // on unix
```
The tests create and dispose of a test database dynamically, so you don't need to set anything up. This database is created using a default connection string that connects to your local Postgres server like this:
```fsharp
let createTestDatabase() =
    Sql.host "localhost"
    |> Sql.port 5432
    |> Sql.username "postgres"
    |> Sql.password "postgres"
    |> Sql.str
    |> ThrowawayDatabase.Create
```
Make sure you have a user with the username and password `postgres`.
- `Clean` - Cleans artifact and temp directories.
- `DotnetRestore` - Runs `dotnet restore` on the solution file.
- `DotnetBuild` - Runs `dotnet build` on the solution file.
- `DotnetTest` - Runs `dotnet test` on the solution file.
- `GenerateCoverageReport` - Code coverage is run during `DotnetTest` and this generates a report via ReportGenerator.
- `WatchTests` - Runs `dotnet watch` with the test projects. Useful for rapid feedback loops.
- `GenerateAssemblyInfo` - Generates AssemblyInfo for libraries.
- `DotnetPack` - Runs `dotnet pack`. This includes running Source Link.
- `PublishToNuGet` - Publishes the NuGet packages generated in `DotnetPack` to NuGet via paket push.
- `BuildDocs` - Generates documentation from `docsSrc` and the XML documentation comments from your libraries in `src`.
- `WatchDocs` - Generates documentation and starts a webserver locally. It will rebuild and hot reload if it detects any changes made to `docsSrc` files, libraries in `src`, or the `docsTool` itself.
- `ReleaseDocs` - Will stage, commit, and push docs generated in the `BuildDocs` target.
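For example, to run only the test suite from the repository root (on a unix shell; substitute `build.cmd` on windows):

```
# run a single build target by passing its name to the build script
./build.sh DotnetTest
```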
