# porsager / postgres

Postgres.js - The Fastest full featured PostgreSQL client for Node.js, Deno, Bun and Cloudflare
## Repository Overview (README excerpt)
- 🚀 Fastest full-featured node & deno client
- 🏷 ES6 Tagged Template Strings at the core
- 🏄‍♀️ Simple surface API
- 🖊️ Dynamic query support
- 💬 Chat and help on Gitter
- 🐦 Follow on Twitter

## Getting started

### Installation

### Usage

Create your database instance. Simply import for use elsewhere.

### ESM dynamic imports

The library can be used with ESM dynamic imports as well, as shown here.

## Table of Contents

- Connection
- Queries
- Building queries
- Advanced query methods
- Transactions
- Data Transformation
- Listen & notify
- Realtime subscribe
- Numbers, bigint, numeric
- Result Array
- Connection details
- Custom Types
- Teardown / Cleanup
- Error handling
- TypeScript support
- Reserving connections
- Changelog

## Connection

You can use either a url connection string or an options object to define your database connection properties. Options in the object will override any present in the url. Options will fall back to the same environment variables as psql. More options can be found in the Connection details section.

## Queries

Postgres.js utilizes tagged template functions to process query parameters **before** interpolation. Using tagged template literals benefits developers by:

- **Enforcing** safe query generation
- Giving the `sql` function powerful utility and query building features

Any generic value will be serialized according to an inferred type and replaced by a PostgreSQL protocol placeholder such as `$1`. The parameters are then sent separately to the database, which handles escaping & casting. All queries return a result array, with objects mapping column names to each row.

> Please note that queries are first executed when awaited – or instantly by using `.execute()`.

### Query parameters

Parameters are automatically extracted and handled by the database so that SQL injection isn't possible. No special handling is necessary; simply use tagged template literals as usual.

> Be careful with quotation marks here.
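To make the mechanics concrete, here is a minimal stand-alone sketch (not the actual Postgres.js implementation) of how a tag function receives the literal text segments and the interpolated values separately, replacing each value with a numbered placeholder. The `sql` name and the returned `{ text, parameters }` shape are illustrative only:

```javascript
// Illustrative sketch only: how a tagged template function can split a
// query into static text plus numbered placeholders. The real library
// does considerably more (type inference, fragments, helpers).
function sql(strings, ...values) {
  // strings holds the literal segments; values holds the interpolations.
  let text = strings[0];
  values.forEach((_, i) => {
    text += '$' + (i + 1) + strings[i + 1];
  });
  // The values travel separately, so the server handles escaping/casting.
  return { text, parameters: values };
}

const name = 'Murray';
const age = 68;
const query = sql`select * from users where name = ${name} and age > ${age}`;
// query.text       → 'select * from users where name = $1 and age > $2'
// query.parameters → ['Murray', 68]
```

Note how the interpolated values never appear in the query text itself; they are shipped to the server alongside it.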
Because Postgres infers column types, you do not need to wrap your interpolated parameters in quotes like `'${name}'`. This will cause an error, because the tagged template replaces `${name}` with `$1` in the query string, leaving Postgres to do the interpolation. If you wrap that in a string, Postgres will see `'$1'` and interpret it as a string as opposed to a parameter.

### Dynamic column selection

### Dynamic inserts

**You can omit column names and simply execute `sql(user)` to get all the fields from the object as columns.** Be careful not to allow users to supply columns that you do not want to be inserted.

### Multiple inserts in one query

If you need to insert multiple rows at the same time, it's also much faster to do it with a single `insert`. Simply pass an array of objects to `sql()`.

### Dynamic columns in updates

This is also useful for update queries.

### Multiple updates in one query

To create multiple updates in a single query, it is necessary to use arrays instead of objects to ensure that the order of the items corresponds with the column names.

### Dynamic values and `where in`

Value lists can also be created dynamically, making `where in` queries simple too.

## Building queries

Postgres.js features a simple dynamic query builder that works by conditionally appending or omitting query fragments. Fragments are nested within other `sql` calls or fragments, which allows you to build dynamic queries safely, without risking SQL injection through the usual string concatenation.

### Partial queries

### Dynamic filters

### Dynamic ordering

### SQL functions

Using keywords or calling functions dynamically is also possible by using `sql` fragments.
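As a rough illustration of what a multi-row insert has to produce, here is a stand-alone sketch (not the library's code) that expands an array of objects into one INSERT statement with numbered placeholders. `buildInsert` is a hypothetical helper name, and unlike the real library it does not escape the table or column identifiers:

```javascript
// Illustrative sketch only: expand an array of row objects into a single
// multi-row INSERT with numbered placeholders. Identifiers are inserted
// verbatim here; Postgres.js escapes them properly.
function buildInsert(table, rows) {
  const columns = Object.keys(rows[0]);
  const parameters = [];
  const tuples = rows.map(row =>
    '(' + columns.map(column => {
      parameters.push(row[column]);      // collect the value ...
      return '$' + parameters.length;    // ... and emit its placeholder
    }).join(', ') + ')'
  );
  const text =
    `insert into ${table} (${columns.join(', ')}) values ` + tuples.join(', ');
  return { text, parameters };
}

const insert = buildInsert('users', [
  { name: 'Murray', age: 68 },
  { name: 'Walter', age: 80 }
]);
// insert.text       → 'insert into users (name, age) values ($1, $2), ($3, $4)'
// insert.parameters → ['Murray', 68, 'Walter', 80]
```

One statement with all rows is much cheaper than one round trip per row, which is why a single multi-row insert is the faster option.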
### Table names

Dynamic identifiers like table names and column names are also supported like so:

### Quick primer on interpolation

Here's a quick overview of all the ways to do interpolation in a query template string:

| Interpolation syntax | Usage | Example |
| ------------- | ------------- | ------------- |
| ``` ${ sql`` } ``` | for keywords or sql fragments | ``await sql`select * from users ${sql`order by age desc`}` `` |
| `${ sql(string) }` | for identifiers | ``await sql`select * from ${sql('table_name')}` `` |
| `${ sql(object) }` | for helpers | ``await sql`insert into users ${sql({ name: 'Peter' })}` `` |
| `${ value }` | for values | ``await sql`select * from users where age = ${42}` `` |

## Advanced query methods

### Cursors

Use cursors if you need to throttle the amount of rows being returned from a query. You can use a cursor either as an async iterable or with a callback function. With a callback function, new results won't be requested until the promise / async callback function has resolved.

#### callback function

#### for await...of

A single row will be returned by default, but you can also request batches by setting the number of rows desired in each batch as the first argument to `.cursor()`.

If an error is thrown inside the callback function, no more rows will be requested and the outer promise will reject with the thrown error.

You can close the cursor early either by calling `break` in the `for await...of` loop, or by returning the token `sql.CLOSE` from the callback function.

### Instant iteration

If you want to handle rows returned by a query one by one, you can use `.forEach`, which returns a promise that resolves once there are no more rows.

### Query Descriptions

Rather than executing a given query, `.describe` will return information utilized in the query process. This information can include the query identifier, column types, etc. It is useful for debugging and analyzing your Postgres queries. Furthermore, `.describe` **will give you access to the final generated query string that would be executed.**

### Rows as Array of Values

Using `.values` will return rows as an array of values for each column, instead of objects. This can be useful for receiving identically named columns, or for specific performance/transformation reasons.
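The cursor consumption pattern can be sketched without a database by standing in a plain async generator for the query; with Postgres.js the iterable would come from a cursor on a real query, while `fakeCursor` below is purely illustrative:

```javascript
// Illustrative stand-in for a server-side cursor: yields rows in batches,
// and does not produce the next batch until the consumer asks for it.
async function* fakeCursor(rows, batchSize) {
  for (let i = 0; i < rows.length; i += batchSize) {
    yield rows.slice(i, i + batchSize);
  }
}

async function consume() {
  const rows = [{ id: 1 }, { id: 2 }, { id: 3 }, { id: 4 }, { id: 5 }];
  const batchSizes = [];
  for await (const batch of fakeCursor(rows, 2)) {
    // Work on the batch here; the iteration itself throttles the flow,
    // since the generator is only resumed when this body finishes.
    batchSizes.push(batch.length);
    if (batchSizes.length === 2) break; // closing the cursor early
  }
  return batchSizes;
}

consume().then(sizes => console.log(sizes)); // → [ 2, 2 ]
```

The same backpressure applies to the callback form: a new batch is only requested once the previous callback's promise resolves.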
The column definitions are still included on the result array, plus access to parsers for each column.

### Rows as Raw Array of Buffers

Using `.raw` will return rows as an array with `Buffer` values for each column, instead of objects. This can be useful for specific performance/transformation reasons. The column definitions are still included on the result array, plus access to parsers for each column.

### Queries in Files

Using a file for a query is also supported, with optional parameters to use if the file includes parameters like `$1`.

### Multiple statements in one query

The postgres wire protocol supports "sim…
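To show the shape difference that the rows-as-values mode described above is about, here is a small stand-alone sketch using hard-coded rows rather than a live query; the `columns` and `objectRows` names are illustrative:

```javascript
// Hard-coded stand-ins for a query result: the default shape is one
// object per row, mapping column names to values.
const columns = ['id', 'name'];
const objectRows = [
  { id: 1, name: 'Murray' },
  { id: 2, name: 'Walter' }
];

// The values shape: one array per row, in column order. Identically
// named columns each keep their own slot instead of colliding on a key.
const valueRows = objectRows.map(row => columns.map(column => row[column]));
// valueRows → [[1, 'Murray'], [2, 'Walter']]
```

The array form also skips building an object per row, which is where the performance benefit for large result sets comes from.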