The current API status is BETA. Minor changes are possible before this status is removed. Once the status is finalized, JSON schemas describing the request and response formats will be published.
Authentication
The holistic.dev API uses API keys to authenticate requests. You can view and manage your API key in the Account Settings.
The holistic.dev API is a REST-based service. Consequently, all requests to the API require this HTTP header:
x-api-key: <your-api-key>
Content type
The holistic.dev API is also a JSON-based service. You must add the Content-Type HTTP header to all your requests:
Content-Type: application/json
URL and API versioning
Only one API version is available at this time. Use this base URL for your requests:
https://api.holistic.dev/api/v1
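For example, a minimal request combining the base URL and the two required headers might look like this (a sketch; the project list endpoint it calls is documented below):
Bash
curl \
  --header "x-api-key: <your-api-key>" \
  --header "Content-Type: application/json" \
  --request GET https://api.holistic.dev/api/v1/project/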
Responses
All responses are JSON-based and follow one of these formats:
// successful response
{
  "status": "OK",
  "data": { ... }
}
// error response
{
  "status": "ERROR",
  "data": {
    "code": <http-code>,
    "message": "<error-message>",
    "details": <mixed-object (optional)>
  }
}
HTTP codes and error messages can be found in the Errors section.
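If you call the API from scripts, you can branch on the status field. A minimal sketch, assuming jq is installed (jq is not required by the API itself):
Bash
# Call the project list endpoint and print the error message if status is not "OK".
RESPONSE=$(curl -s \
  --header "x-api-key: <your-api-key>" \
  --header "Content-Type: application/json" \
  --request GET https://api.holistic.dev/api/v1/project/)
if [ "$(echo "$RESPONSE" | jq -r '.status')" != "OK" ]; then
  echo "API error: $(echo "$RESPONSE" | jq -r '.data.message')" >&2
fi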
Projects
A project represents one database. Each project has a name and a database type. Each project can contain multiple environments (development, test, stage, production) and multiple version control system branches.
Multiple environments and multiple branches will be available soon.
Create project
project
POST
https://api.holistic.dev/api/v1/project
Create new project
Request Body
200 project.uuid - unique project identifier
{
"status": "OK",
"data": {
"project": {
"date": "2020-01-01T00:00:00.000Z",
"name": "project name",
"uuid": "00000000-0000-0000-0000-000000000000"
}
}
}
Example:
JSON Bash
{
  "project": {
    "name": "project name",
    "db": "pg"
  }
}
HOLISTICDEV_API_KEY="<your-api-key>" HOLISTICDEV_PROJECT_NAME="<project-name>"; \
echo "{\"project\":{\"name\":\"$HOLISTICDEV_PROJECT_NAME\",\"db\":\"pg\"}}" | \
curl \
--header "x-api-key: $HOLISTICDEV_API_KEY" \
--header "Content-Type: application/json" \
--request POST --data @- https://api.holistic.dev/api/v1/project/
Rename project
project
PATCH
https://api.holistic.dev/api/v1/project
Rename project
Request Body
200 project.name - new project name
Copy {
"status": "OK",
"data": {
"project": {
"name": "project name",
"uuid": "00000000-0000-0000-0000-000000000000"
}
}
}
Example:
JSON Bash
{
  "project": {
    "name": "project name",
    "uuid": "00000000-0000-0000-0000-000000000000"
  }
}
HOLISTICDEV_API_KEY="<your-api-key>" HOLISTICDEV_PROJECT_NAME="<project-name>" HOLISTICDEV_PROJECT_UUID="<project-uuid>"; \
echo "{\"project\":{\"name\":\"$HOLISTICDEV_PROJECT_NAME\",\"uuid\":\"$HOLISTICDEV_PROJECT_UUID\"}}" | \
curl \
--header "x-api-key: $HOLISTICDEV_API_KEY" \
--header "Content-Type: application/json" \
--request PATCH --data @- https://api.holistic.dev/api/v1/project/
List projects
project
GET
https://api.holistic.dev/api/v1/project
Get projects list
200
{
"data": [
{
"ddl": {
"ast": {
"elements": {
"comment": 62,
"parsed": 31
},
"errors": 0
},
"compiled": {
"functions": 0,
"operators": 0,
"relations": 12,
"schemas": 0,
"types": 0
},
"date": "2020-01-01T00:00:00.000Z",
"files": 2,
"issues": 0,
"uuid": "00000000-0000-0000-0000-000000000000"
},
"dml": {
"count": 1
},
"project": {
"date": "2020-01-01T00:00:00.000Z",
"db": "pg",
"name": "default",
"uuid": "00000000-0000-0000-0000-000000000000"
},
"users": {
"owner": "info@holistic.dev",
"shared": [
"support@holistic.dev"
]
}
}
],
"status": "OK"
}
Example:
Bash
HOLISTICDEV_API_KEY="<your-api-key>"; \
curl \
--header "x-api-key: $HOLISTICDEV_API_KEY" \
--header "Content-Type: application/json" \
--request GET https://api.holistic.dev/api/v1/project/
Project details
project/:uuid/details
GET
https://api.holistic.dev/api/v1/project/:uuid/details
Get project details
Path Parameters
200
{
"data": {
"project": {
"date": "2020-01-01T00:00:00.000Z",
"db": "pg",
"name": "default",
"uuid": "00000000-0000-0000-0000-000000000000"
},
"users": {
"owner": "info@holistic.dev",
"shared": [
"support@holistic.dev"
]
}
},
"status": "OK"
}
Example:
Bash
HOLISTICDEV_API_KEY="<your-api-key>" HOLISTICDEV_PROJECT_UUID="<project-uuid>"; \
curl \
--header "x-api-key: $HOLISTICDEV_API_KEY" \
--header "Content-Type: application/json" \
--request GET "https://api.holistic.dev/api/v1/project/$HOLISTICDEV_PROJECT_UUID/details"
Project DDL details
project/:uuid/ddl
GET
https://api.holistic.dev/api/v1/project/:uuid/ddl
Get project DDL details
Path Parameters
200
{
"data": {
"ddl": {
"ast": {
"elements": {
"comment": 62,
"parsed": 31
},
"errors": 0
},
"check": {
"status": "finished"
},
"compiled": {
"functions": 0,
"operators": 0,
"relations": 12,
"schemas": 0,
"types": 0
},
"date": "2020-01-01T00:00:00.000Z",
"files": 2,
"issues": 37,
"uuid": "00000000-0000-0000-0000-000000000000",
"version": null
}
},
"status": "OK"
}
Example:
Bash
HOLISTICDEV_API_KEY="<your-api-key>" HOLISTICDEV_PROJECT_UUID="<project-uuid>"; \
curl \
--header "x-api-key: $HOLISTICDEV_API_KEY" \
--header "Content-Type: application/json" \
--request GET "https://api.holistic.dev/api/v1/project/$HOLISTICDEV_PROJECT_UUID/ddl"
Project DMLs list
project/:uuid/dml/list
GET
https://api.holistic.dev/api/v1/project/:uuid/dml/list
Get project DML list
Path Parameters
200
{
"data": [
{
"dml": {
"check": {
"status": "finished"
},
"date": "booking-info.sql",
"issues": 0,
"name": "sql-name",
"source": {
"from": "api",
"sql": "SELECT \n b.book_ref,\n t.ticket_no,\n t.passenger_id,\n t.passenger_name,\n tf.fare_conditions,\n tf.amount,\n f.scheduled_departure_local,\n f.scheduled_arrival_local,\n f.departure_city || '(' || f.departure_airport || ')' as departure,\n f.arrival_city || '(' || f.arrival_airport || ')' as arrival,\n f.status,\n bp.seat_no\nFROM\n bookings b\n JOIN tickets t ON b.book_ref = t.book_ref\n JOIN ticket_flights tf ON tf.ticket_no = t.ticket_no\n JOIN flights_v f ON tf.flight_id = f.flight_id\n LEFT JOIN boarding_passes bp ON tf.flight_id = bp.flight_id AND tf.ticket_no = bp.ticket_no\nWHERE\n b.book_ref = '_QWE12'\nORDER BY\n t.ticket_no,\n f.scheduled_departure"
},
"uuid": "00000000-0000-0000-0000-000000000000",
"version": null
}
}
],
"status": "OK"
}
Example:
Bash
HOLISTICDEV_API_KEY="<your-api-key>" HOLISTICDEV_PROJECT_UUID="<project-uuid>"; \
curl \
--header "x-api-key: $HOLISTICDEV_API_KEY" \
--header "Content-Type: application/json" \
--request GET "https://api.holistic.dev/api/v1/project/$HOLISTICDEV_PROJECT_UUID/dml/list"
SQL Syntax
At this time, holistic.dev supports only PostgreSQL syntax. We have implemented the latest original parser from PostgreSQL 13, so we support modern PostgreSQL features.
PostgreSQL's syntax includes all standard SQL syntax, so quite possibly you can also parse queries written for other RDBMSs - MySQL, MSSQL, Oracle, and more. However, all analyzer rules target PostgreSQL-specific behavior.
DDL can consist of multiple files, each of which can include many DDL statements. All unknown or syntactically incorrect statements will be ignored.
DML can contain only one DML statement. All other statements will be ignored. DML can use one of the following parameter syntaxes:
-- SUPPORTED:
SELECT $1, ${obj.name}, $/obj.name/, $[obj.name], $(obj.name)
Warning!
Question-mark parameters are not supported!
-- NOT SUPPORTED!!!
SELECT ?
Database schema (DDL)
DDL, aka Data Definition Language, is an SQL subset that includes CREATE, ALTER, and DROP statements. It is used to define the database structure. It can also include DML statements with extension commands such as create_hypertable() from TimescaleDB. All supported extensions can be found in the extensions list.
Knowledge of the database structure is a critical requirement for static analysis of SQL queries. The database structure, described in DDL syntax, must be uploaded before any DML queries can be processed.
If you store your database schema in a version control system, you can upload it directly for any project.
DDL can consist of multiple files, and we store them as is for better navigation.
If you do not have DDL synchronized with the production database in your version control system, extract the DDL directly from the target database:
Bash
PGPASSWORD=<pg-password> pg_dump -h <pg-host> -p <pg-port> -U <pg-username> \
  -d <pg-db-name> --schema-only --no-owner --no-privileges --no-security-labels \
  > ddl.sql
database parameters:
<pg-host> - database host
<pg-port> - database port
<pg-username> - database user
<pg-password> - database user password
<pg-db-name> - database name
The pg_dump utility knows nothing about an extension's required routines. For example, you should add commands like create_hypertable() from TimescaleDB yourself. We recommend storing them in a separate file and uploading both.
Upload DDL
ddl
POST
https://api.holistic.dev/api/v1/ddl
Upload a brand new DDL or replace an existing DDL with a new version
Request Body
200 ddl.uuid - unique ddl identifier
{
  "status": "OK",
  "data": {
    "ddl": {
      "uuid": "00000000-0000-0000-0000-000000000000"
    }
  }
}
Example:
JSON Bash
{
  "project": {
    "name": "default"
  },
  "ddl": {
    "version": null
  },
  "files": [
    {
      "name": "filename-1.sql",
"source": "CREATE TABLE aircrafts_data (\n aircraft_code character(3) NOT NULL,\n model jsonb NOT NULL,\n range integer NOT NULL,\n CONSTRAINT aircrafts_range_check CHECK ((range > 0))\n);\n\nCREATE VIEW aircrafts AS\n SELECT ml.aircraft_code,\n (ml.model ->> lang()) AS model,\n ml.range\n FROM aircrafts_data ml;\n\n\n\nCREATE TABLE airports_data (\n airport_code character(3) NOT NULL,\n airport_name jsonb NOT NULL,\n city jsonb NOT NULL,\n coordinates point NOT NULL,\n timezone text NOT NULL\n);\n\n\n\nCREATE VIEW airports AS\n SELECT ml.airport_code,\n (ml.airport_name ->> lang()) AS airport_name,\n (ml.city ->> lang()) AS city,\n ml.coordinates,\n ml.timezone\n FROM airports_data ml;\n\n\n\nCREATE TABLE boarding_passes (\n ticket_no character(13) NOT NULL,\n flight_id integer NOT NULL,\n boarding_no integer NOT NULL,\n seat_no character varying(4) NOT NULL\n);"
    },
    {
      "name": "filename-2.sql",
"source": "CREATE TABLE bookings (\n book_ref character(6) NOT NULL,\n book_date timestamp with time zone NOT NULL,\n total_amount numeric(10,2) NOT NULL\n);\n\n\nCREATE TABLE flights (\n flight_id integer NOT NULL,\n flight_no character(6) NOT NULL,\n scheduled_departure timestamp with time zone NOT NULL,\n scheduled_arrival timestamp with time zone NOT NULL,\n departure_airport character(3) NOT NULL,\n arrival_airport character(3) NOT NULL,\n status character varying(20) NOT NULL,\n aircraft_code character(3) NOT NULL,\n actual_departure timestamp with time zone,\n actual_arrival timestamp with time zone,\n CONSTRAINT flights_check CHECK ((scheduled_arrival > scheduled_departure)),\n CONSTRAINT flights_check1 CHECK (((actual_arrival IS NULL) OR ((actual_departure IS NOT NULL) AND (actual_arrival IS NOT NULL) AND (actual_arrival > actual_departure)))),\n CONSTRAINT flights_status_check CHECK (((status)::text = ANY (ARRAY[('On Time'::character varying)::text, ('Delayed'::character varying)::text, ('Departed'::character varying)::text, ('Arrived'::character varying)::text, ('Scheduled'::character varying)::text, ('Cancelled'::character varying)::text])))\n);\n\n\nCREATE VIEW flights_v AS\n SELECT f.flight_id,\n f.flight_no,\n f.scheduled_departure,\n timezone(dep.timezone, f.scheduled_departure) AS scheduled_departure_local,\n f.scheduled_arrival,\n timezone(arr.timezone, f.scheduled_arrival) AS scheduled_arrival_local,\n (f.scheduled_arrival - f.scheduled_departure) AS scheduled_duration,\n f.departure_airport,\n dep.airport_name AS departure_airport_name,\n dep.city AS departure_city,\n f.arrival_airport,\n arr.airport_name AS arrival_airport_name,\n arr.city AS arrival_city,\n f.status,\n f.aircraft_code,\n f.actual_departure,\n timezone(dep.timezone, f.actual_departure) AS actual_departure_local,\n f.actual_arrival,\n timezone(arr.timezone, f.actual_arrival) AS actual_arrival_local,\n (f.actual_arrival - f.actual_departure) AS actual_duration\n FROM flights f,\n airports dep,\n airports arr\n WHERE ((f.departure_airport = dep.airport_code) AND (f.arrival_airport = arr.airport_code));\n\n\n\nCREATE VIEW routes AS\n WITH f3 AS (\n SELECT f2.flight_no,\n f2.departure_airport,\n f2.arrival_airport,\n f2.aircraft_code,\n f2.duration,\n array_agg(f2.days_of_week) AS days_of_week\n FROM ( SELECT f1.flight_no,\n f1.departure_airport,\n f1.arrival_airport,\n f1.aircraft_code,\n f1.duration,\n f1.days_of_week\n FROM ( SELECT flights.flight_no,\n flights.departure_airport,\n flights.arrival_airport,\n flights.aircraft_code,\n (flights.scheduled_arrival - flights.scheduled_departure) AS duration,\n (to_char(flights.scheduled_departure, 'ID'::text))::integer AS days_of_week\n FROM flights) f1\n GROUP BY f1.flight_no, f1.departure_airport, f1.arrival_airport, f1.aircraft_code, f1.duration, f1.days_of_week\n ORDER BY f1.flight_no, f1.departure_airport, f1.arrival_airport, f1.aircraft_code, f1.duration, f1.days_of_week) f2\n GROUP BY f2.flight_no, f2.departure_airport, f2.arrival_airport, f2.aircraft_code, f2.duration\n )\n SELECT f3.flight_no,\n f3.departure_airport,\n dep.airport_name AS departure_airport_name,\n dep.city AS departure_city,\n f3.arrival_airport,\n arr.airport_name AS arrival_airport_name,\n arr.city AS arrival_city,\n f3.aircraft_code,\n f3.duration,\n f3.days_of_week\n FROM f3,\n airports dep,\n airports arr\n WHERE ((f3.departure_airport = dep.airport_code) AND (f3.arrival_airport = arr.airport_code));\n\n\nCREATE TABLE seats (\n aircraft_code character(3) NOT NULL,\n seat_no 
character varying(4) NOT NULL,\n fare_conditions character varying(10) NOT NULL,\n CONSTRAINT seats_fare_conditions_check CHECK (((fare_conditions)::text = ANY (ARRAY[('Economy'::character varying)::text, ('Comfort'::character varying)::text, ('Business'::character varying)::text])))\n);\n\n\nCREATE TABLE ticket_flights (\n ticket_no character(13) NOT NULL,\n flight_id integer NOT NULL,\n fare_conditions character varying(10) NOT NULL,\n amount numeric(10,2) NOT NULL,\n CONSTRAINT ticket_flights_amount_check CHECK ((amount >= (0)::numeric)),\n CONSTRAINT ticket_flights_fare_conditions_check CHECK (((fare_conditions)::text = ANY (ARRAY[('Economy'::character varying)::text, ('Comfort'::character varying)::text, ('Business'::character varying)::text])))\n);\n\n\n\nCREATE TABLE tickets (\n ticket_no character(13) NOT NULL,\n book_ref character(6) NOT NULL,\n passenger_id character varying(20) NOT NULL,\n passenger_name text NOT NULL,\n contact_data jsonb\n);\n\n\nALTER TABLE ONLY aircrafts_data\n ADD CONSTRAINT aircrafts_pkey PRIMARY KEY (aircraft_code);\n\n\nALTER TABLE ONLY airports_data\n ADD CONSTRAINT airports_data_pkey PRIMARY KEY (airport_code);\n\n\nALTER TABLE ONLY boarding_passes\n ADD CONSTRAINT boarding_passes_flight_id_boarding_no_key UNIQUE (flight_id, boarding_no);\n\n\nALTER TABLE ONLY boarding_passes\n ADD CONSTRAINT boarding_passes_flight_id_seat_no_key UNIQUE (flight_id, seat_no);\n\n\nALTER TABLE ONLY boarding_passes\n ADD CONSTRAINT boarding_passes_pkey PRIMARY KEY (ticket_no, flight_id);\n\nALTER TABLE ONLY bookings\n ADD CONSTRAINT bookings_pkey PRIMARY KEY (book_ref);\n\n\nALTER TABLE ONLY flights\n ADD CONSTRAINT flights_flight_no_scheduled_departure_key UNIQUE (flight_no, scheduled_departure);\n\n\nALTER TABLE ONLY flights\n ADD CONSTRAINT flights_pkey PRIMARY KEY (flight_id);\n\n\nALTER TABLE ONLY seats\n ADD CONSTRAINT seats_pkey PRIMARY KEY (aircraft_code, seat_no);\n\n\nALTER TABLE ONLY ticket_flights\n ADD CONSTRAINT ticket_flights_pkey PRIMARY KEY (ticket_no, flight_id);\n\n\nALTER TABLE ONLY tickets\n ADD CONSTRAINT tickets_pkey PRIMARY KEY (ticket_no);\n\n\nALTER TABLE ONLY boarding_passes\n ADD CONSTRAINT boarding_passes_ticket_no_fkey FOREIGN KEY (ticket_no, flight_id) REFERENCES ticket_flights(ticket_no, flight_id);\n\nALTER TABLE ONLY flights\n ADD CONSTRAINT flights_aircraft_code_fkey FOREIGN KEY (aircraft_code) REFERENCES aircrafts_data(aircraft_code);\n\n\nALTER TABLE ONLY flights\n ADD CONSTRAINT flights_arrival_airport_fkey FOREIGN KEY (arrival_airport) REFERENCES airports_data(airport_code);\n\n\nALTER TABLE ONLY flights\n ADD CONSTRAINT flights_departure_airport_fkey FOREIGN KEY (departure_airport) REFERENCES airports_data(airport_code);\n\n\nALTER TABLE ONLY seats\n ADD CONSTRAINT seats_aircraft_code_fkey FOREIGN KEY (aircraft_code) REFERENCES aircrafts_data(aircraft_code) ON DELETE CASCADE;\n\n\nALTER TABLE ONLY ticket_flights\n ADD CONSTRAINT ticket_flights_flight_id_fkey FOREIGN KEY (flight_id) REFERENCES flights(flight_id);\n\n\nALTER TABLE ONLY ticket_flights\n ADD CONSTRAINT ticket_flights_ticket_no_fkey FOREIGN KEY (ticket_no) REFERENCES tickets(ticket_no);\n\n\n\nALTER TABLE ONLY tickets\n ADD CONSTRAINT tickets_book_ref_fkey FOREIGN KEY (book_ref) REFERENCES bookings(book_ref);"
}
]
}
HOLISTICDEV_API_KEY="<your-api-key>" HOLISTICDEV_PROJECT_NAME="<project-name>"; \
DATA=$(cat ddl.sql | base64 -w0); \
echo "{\"project\":{\"name\":\"$HOLISTICDEV_PROJECT_NAME\"},\"ddl\":{\"version\":null},\"files\":[{\"name\":\"ddl.sql\",\"source\":\"$DATA\"}]}" | \
curl \
--header "x-api-key: $HOLISTICDEV_API_KEY" \
--header "Content-Type: application/json" \
--request POST --data @- https://api.holistic.dev/api/v1/ddl/
The base64 argument -w0 is needed to prevent line-wrapping of the result on Linux-based OSes.
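If you keep extension commands in a separate file, as recommended above, the same endpoint accepts several entries in the files array, as in the JSON example. A sketch only; the file name extensions.sql is an illustration:
Bash
HOLISTICDEV_API_KEY="<your-api-key>" HOLISTICDEV_PROJECT_NAME="<project-name>"; \
DDL=$(cat ddl.sql | base64 -w0); EXT=$(cat extensions.sql | base64 -w0); \
echo "{\"project\":{\"name\":\"$HOLISTICDEV_PROJECT_NAME\"},\"ddl\":{\"version\":null},\"files\":[{\"name\":\"ddl.sql\",\"source\":\"$DDL\"},{\"name\":\"extensions.sql\",\"source\":\"$EXT\"}]}" | \
curl \
  --header "x-api-key: $HOLISTICDEV_API_KEY" \
  --header "Content-Type: application/json" \
  --request POST --data @- https://api.holistic.dev/api/v1/ddl/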
All files of the previous version will be replaced, even if their number does not match the number of files of the new version.
We store the history of all schema changes for future features.
You can also copy a bash script with your credentials from the project card in the project list inside the client area.
Patch DDL
ddl
PATCH
https://api.holistic.dev/api/v1/ddl
Patch an existing DDL with new DDL commands
Request Body
200 ddl.uuid - unique ddl identifier
{
  "status": "OK",
  "data": {
    "ddl": {
      "uuid": "00000000-0000-0000-0000-000000000000"
    }
  }
}
Example:
JSON
{
  "project": {
    "name": "default"
  },
  "ddl": {
    "version": null
  },
  "source": "CREATE INDEX ON aircrafts_data (aircraft_code)"
}
Patches the existing DDL with new DDL commands. The source will be appended to the end of the existing DDL as a separate file. The DDL and ALL project DMLs will be re-parsed after this.
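A companion bash sketch for this endpoint, following the same pattern as the other examples (the index statement is just the sample source shown above):
Bash
HOLISTICDEV_API_KEY="<your-api-key>" HOLISTICDEV_PROJECT_NAME="<project-name>"; \
echo "{\"project\":{\"name\":\"$HOLISTICDEV_PROJECT_NAME\"},\"ddl\":{\"version\":null},\"source\":\"CREATE INDEX ON aircrafts_data (aircraft_code)\"}" | \
curl \
  --header "x-api-key: $HOLISTICDEV_API_KEY" \
  --header "Content-Type: application/json" \
  --request PATCH --data @- https://api.holistic.dev/api/v1/ddl/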
Get DDL source
ddl/:uuid/source
GET
https://api.holistic.dev/api/v1/ddl/:uuid/source
Get DDL source by DDL uuid
Path Parameters
200
{
  "data": {
    "ddl": [
      {
        "file": {
          "name": "filename-1.sql",
          "ordernum": 0,
"source": "CREATE TABLE aircrafts_data (\n aircraft_code character(3) NOT NULL,\n model jsonb NOT NULL,\n range integer NOT NULL,\n CONSTRAINT aircrafts_range_check CHECK ((range > 0))\n);\n\nCREATE VIEW aircrafts AS\n SELECT ml.aircraft_code,\n (ml.model ->> lang()) AS model,\n ml.range\n FROM aircrafts_data ml;\n\n\n\nCREATE TABLE airports_data (\n airport_code character(3) NOT NULL,\n airport_name jsonb NOT NULL,\n city jsonb NOT NULL,\n coordinates point NOT NULL,\n timezone text NOT NULL\n);\n\n\n\nCREATE VIEW airports AS\n SELECT ml.airport_code,\n (ml.airport_name ->> lang()) AS airport_name,\n (ml.city ->> lang()) AS city,\n ml.coordinates,\n ml.timezone\n FROM airports_data ml;\n\n\n\nCREATE TABLE boarding_passes (\n ticket_no character(13) NOT NULL,\n flight_id integer NOT NULL,\n boarding_no integer NOT NULL,\n seat_no character varying(4) NOT NULL\n);"
        },
        "uuid": "00000000-0000-0000-0000-000000000000",
        "version": null
      },
      {
        "file": {
          "name": "filename-2.sql",
          "ordernum": 2,
"source": "CREATE TABLE bookings (\n book_ref character(6) NOT NULL,\n book_date timestamp with time zone NOT NULL,\n total_amount numeric(10,2) NOT NULL\n);\n\n\nCREATE TABLE flights (\n flight_id integer NOT NULL,\n flight_no character(6) NOT NULL,\n scheduled_departure timestamp with time zone NOT NULL,\n scheduled_arrival timestamp with time zone NOT NULL,\n departure_airport character(3) NOT NULL,\n arrival_airport character(3) NOT NULL,\n status character varying(20) NOT NULL,\n aircraft_code character(3) NOT NULL,\n actual_departure timestamp with time zone,\n actual_arrival timestamp with time zone,\n CONSTRAINT flights_check CHECK ((scheduled_arrival > scheduled_departure)),\n CONSTRAINT flights_check1 CHECK (((actual_arrival IS NULL) OR ((actual_departure IS NOT NULL) AND (actual_arrival IS NOT NULL) AND (actual_arrival > actual_departure)))),\n CONSTRAINT flights_status_check CHECK (((status)::text = ANY (ARRAY[('On Time'::character varying)::text, ('Delayed'::character varying)::text, ('Departed'::character varying)::text, ('Arrived'::character varying)::text, ('Scheduled'::character varying)::text, ('Cancelled'::character varying)::text])))\n);\n\n\nCREATE VIEW flights_v AS\n SELECT f.flight_id,\n f.flight_no,\n f.scheduled_departure,\n timezone(dep.timezone, f.scheduled_departure) AS scheduled_departure_local,\n f.scheduled_arrival,\n timezone(arr.timezone, f.scheduled_arrival) AS scheduled_arrival_local,\n (f.scheduled_arrival - f.scheduled_departure) AS scheduled_duration,\n f.departure_airport,\n dep.airport_name AS departure_airport_name,\n dep.city AS departure_city,\n f.arrival_airport,\n arr.airport_name AS arrival_airport_name,\n arr.city AS arrival_city,\n f.status,\n f.aircraft_code,\n f.actual_departure,\n timezone(dep.timezone, f.actual_departure) AS actual_departure_local,\n f.actual_arrival,\n timezone(arr.timezone, f.actual_arrival) AS actual_arrival_local,\n (f.actual_arrival - f.actual_departure) AS actual_duration\n FROM flights f,\n airports dep,\n airports arr\n WHERE ((f.departure_airport = dep.airport_code) AND (f.arrival_airport = arr.airport_code));\n\n\n\nCREATE VIEW routes AS\n WITH f3 AS (\n SELECT f2.flight_no,\n f2.departure_airport,\n f2.arrival_airport,\n f2.aircraft_code,\n f2.duration,\n array_agg(f2.days_of_week) AS days_of_week\n FROM ( SELECT f1.flight_no,\n f1.departure_airport,\n f1.arrival_airport,\n f1.aircraft_code,\n f1.duration,\n f1.days_of_week\n FROM ( SELECT flights.flight_no,\n flights.departure_airport,\n flights.arrival_airport,\n flights.aircraft_code,\n (flights.scheduled_arrival - flights.scheduled_departure) AS duration,\n (to_char(flights.scheduled_departure, 'ID'::text))::integer AS days_of_week\n FROM flights) f1\n GROUP BY f1.flight_no, f1.departure_airport, f1.arrival_airport, f1.aircraft_code, f1.duration, f1.days_of_week\n ORDER BY f1.flight_no, f1.departure_airport, f1.arrival_airport, f1.aircraft_code, f1.duration, f1.days_of_week) f2\n GROUP BY f2.flight_no, f2.departure_airport, f2.arrival_airport, f2.aircraft_code, f2.duration\n )\n SELECT f3.flight_no,\n f3.departure_airport,\n dep.airport_name AS departure_airport_name,\n dep.city AS departure_city,\n f3.arrival_airport,\n arr.airport_name AS arrival_airport_name,\n arr.city AS arrival_city,\n f3.aircraft_code,\n f3.duration,\n f3.days_of_week\n FROM f3,\n airports dep,\n airports arr\n WHERE ((f3.departure_airport = dep.airport_code) AND (f3.arrival_airport = arr.airport_code));\n\n\nCREATE TABLE seats (\n aircraft_code character(3) NOT NULL,\n seat_no 
character varying(4) NOT NULL,\n fare_conditions character varying(10) NOT NULL,\n CONSTRAINT seats_fare_conditions_check CHECK (((fare_conditions)::text = ANY (ARRAY[('Economy'::character varying)::text, ('Comfort'::character varying)::text, ('Business'::character varying)::text])))\n);\n\n\nCREATE TABLE ticket_flights (\n ticket_no character(13) NOT NULL,\n flight_id integer NOT NULL,\n fare_conditions character varying(10) NOT NULL,\n amount numeric(10,2) NOT NULL,\n CONSTRAINT ticket_flights_amount_check CHECK ((amount >= (0)::numeric)),\n CONSTRAINT ticket_flights_fare_conditions_check CHECK (((fare_conditions)::text = ANY (ARRAY[('Economy'::character varying)::text, ('Comfort'::character varying)::text, ('Business'::character varying)::text])))\n);\n\n\n\nCREATE TABLE tickets (\n ticket_no character(13) NOT NULL,\n book_ref character(6) NOT NULL,\n passenger_id character varying(20) NOT NULL,\n passenger_name text NOT NULL,\n contact_data jsonb\n);\n\n\nALTER TABLE ONLY aircrafts_data\n ADD CONSTRAINT aircrafts_pkey PRIMARY KEY (aircraft_code);\n\n\nALTER TABLE ONLY airports_data\n ADD CONSTRAINT airports_data_pkey PRIMARY KEY (airport_code);\n\n\nALTER TABLE ONLY boarding_passes\n ADD CONSTRAINT boarding_passes_flight_id_boarding_no_key UNIQUE (flight_id, boarding_no);\n\n\nALTER TABLE ONLY boarding_passes\n ADD CONSTRAINT boarding_passes_flight_id_seat_no_key UNIQUE (flight_id, seat_no);\n\n\nALTER TABLE ONLY boarding_passes\n ADD CONSTRAINT boarding_passes_pkey PRIMARY KEY (ticket_no, flight_id);\n\nALTER TABLE ONLY bookings\n ADD CONSTRAINT bookings_pkey PRIMARY KEY (book_ref);\n\n\nALTER TABLE ONLY flights\n ADD CONSTRAINT flights_flight_no_scheduled_departure_key UNIQUE (flight_no, scheduled_departure);\n\n\nALTER TABLE ONLY flights\n ADD CONSTRAINT flights_pkey PRIMARY KEY (flight_id);\n\n\nALTER TABLE ONLY seats\n ADD CONSTRAINT seats_pkey PRIMARY KEY (aircraft_code, seat_no);\n\n\nALTER TABLE ONLY ticket_flights\n ADD CONSTRAINT ticket_flights_pkey PRIMARY KEY (ticket_no, flight_id);\n\n\nALTER TABLE ONLY tickets\n ADD CONSTRAINT tickets_pkey PRIMARY KEY (ticket_no);\n\n\nALTER TABLE ONLY boarding_passes\n ADD CONSTRAINT boarding_passes_ticket_no_fkey FOREIGN KEY (ticket_no, flight_id) REFERENCES ticket_flights(ticket_no, flight_id);\n\nALTER TABLE ONLY flights\n ADD CONSTRAINT flights_aircraft_code_fkey FOREIGN KEY (aircraft_code) REFERENCES aircrafts_data(aircraft_code);\n\n\nALTER TABLE ONLY flights\n ADD CONSTRAINT flights_arrival_airport_fkey FOREIGN KEY (arrival_airport) REFERENCES airports_data(airport_code);\n\n\nALTER TABLE ONLY flights\n ADD CONSTRAINT flights_departure_airport_fkey FOREIGN KEY (departure_airport) REFERENCES airports_data(airport_code);\n\n\nALTER TABLE ONLY seats\n ADD CONSTRAINT seats_aircraft_code_fkey FOREIGN KEY (aircraft_code) REFERENCES aircrafts_data(aircraft_code) ON DELETE CASCADE;\n\n\nALTER TABLE ONLY ticket_flights\n ADD CONSTRAINT ticket_flights_flight_id_fkey FOREIGN KEY (flight_id) REFERENCES flights(flight_id);\n\n\nALTER TABLE ONLY ticket_flights\n ADD CONSTRAINT ticket_flights_ticket_no_fkey FOREIGN KEY (ticket_no) REFERENCES tickets(ticket_no);\n\n\n\nALTER TABLE ONLY tickets\n ADD CONSTRAINT tickets_book_ref_fkey FOREIGN KEY (book_ref) REFERENCES bookings(book_ref);"
        },
        "uuid": "00000000-0000-0000-0000-000000000000",
        "version": null
      }
    ],
    "project": {
      "name": "default",
      "uuid": "00000000-0000-0000-0000-000000000000"
    }
  },
  "status": "OK"
}
Example:
Bash
HOLISTICDEV_API_KEY="<your-api-key>" HOLISTICDEV_DDL_UUID="<ddl-uuid>"; \
curl \
--header "x-api-key: $HOLISTICDEV_API_KEY" \
--header "Content-Type: application/json" \
--request GET "https://api.holistic.dev/api/v1/ddl/$HOLISTICDEV_DDL_UUID/source/"
Get DDL abstract syntax tree
ddl/:uuid/ast
GET
https://api.holistic.dev/api/v1/ddl/:uuid/ast
Get DDL abstract syntax tree (AST) by DDL uuid
Path Parameters
200
{
  "data": {
    "ddl": {
      "ast": [<abstract syntax tree>]
    }
  },
  "status": "OK"
}
Example:
Bash
HOLISTICDEV_API_KEY="<your-api-key>" HOLISTICDEV_DDL_UUID="<ddl-uuid>"; \
curl \
--header "x-api-key: $HOLISTICDEV_API_KEY" \
--header "Content-Type: application/json" \
--request GET "https://api.holistic.dev/api/v1/ddl/$HOLISTICDEV_DDL_UUID/ast/"
Get DDL compiled object
ddl/:uuid/object
GET
https://api.holistic.dev/api/v1/ddl/:uuid/object
Get DDL compiled object by DDL uuid
Path Parameters
Query Parameters
200
{
  "data": {
    "ddl": {
      "object": {
        "types": [<user defined types>],
        "functions": [<user defined functions>],
        "operators": [<user defined operators>],
        "sequences": [<sequences>],
        "schemas": [<non-default schemas>],
        "extensions": [<extensions>],
        "relations": [<relations>]
      }
    }
  },
  "status": "OK"
}
Example:
Bash
HOLISTICDEV_API_KEY="<your-api-key>" HOLISTICDEV_DDL_UUID="<ddl-uuid>"; \
curl \
--header "x-api-key: $HOLISTICDEV_API_KEY" \
--header "Content-Type: application/json" \
--request GET "https://api.holistic.dev/api/v1/ddl/$HOLISTICDEV_DDL_UUID/object/"
SQL query (DML)
DML, aka Data Manipulation Language, is an SQL subset that includes SELECT, INSERT, UPDATE, and DELETE statements.
Upload DML
dml
POST
https://api.holistic.dev/api/v1/dml
Upload a brand new DML or replace an existing DML with a new version
Request Body
200 dml.uuid - unique DML identifier
{
  "status": "OK",
  "data": {
    "dml": {
      "uuid": "00000000-0000-0000-0000-000000000000"
    }
  }
}
Example:
JSON Bash
{
  "project": {
    "name": "default"
  },
  "ddl": {
    "version": null
  },
  "dml": {
    "name": "booking-info.sql",
    "version": null,
    "source": {
"sql": "SELECT \n b.book_ref,\n t.ticket_no,\n t.passenger_id,\n t.passenger_name,\n tf.fare_conditions,\n tf.amount,\n f.scheduled_departure_local,\n f.scheduled_arrival_local,\n f.departure_city || '(' || f.departure_airport || ')' as departure,\n f.arrival_city || '(' || f.arrival_airport || ')' as arrival,\n f.status,\n bp.seat_no\nFROM\n bookings b\n JOIN tickets t ON b.book_ref = t.book_ref\n JOIN ticket_flights tf ON tf.ticket_no = t.ticket_no\n JOIN flights_v f ON tf.flight_id = f.flight_id\n LEFT JOIN boarding_passes bp ON tf.flight_id = bp.flight_id AND tf.ticket_no = bp.ticket_no\nWHERE\n b.book_ref = '_QWE12'\nORDER BY\n t.ticket_no,\n f.scheduled_departure"
}
}
}
HOLISTICDEV_API_KEY="<your-api-key>" HOLISTICDEV_PROJECT_NAME="<project-name>"; \
DATA=$(cat dml.sql | base64 -w0); \
echo "{\"project\":{\"name\":\"$HOLISTICDEV_PROJECT_NAME\"},\"ddl\":{\"version\":null},\"dml\":{\"name\":\"dml.sql\", \"version\": null, \"source\":{ \"sql\":\"$DATA\"}}}" | \
curl \
--header "x-api-key: $HOLISTICDEV_API_KEY" \
--header "Content-Type: application/json" \
--request POST --data @- https://api.holistic.dev/api/v1/dml/
The base64 argument -w0 is needed to prevent line-wrapping of the result on Linux-based OSes.
We store the history of all schema changes for future features.
You can also copy a bash script with your credentials from the project card in the project list inside the client area.
Get DML source
dml/:uuid/source
GET
https://api.holistic.dev/api/v1/dml/:uuid/source
Get DML source by DML uuid
Path Parameters
200
{
  "data": {
    "dml": {
      "name": "booking-info.sql",
      "source": {
        "from": "api",
"sql": "SELECT \n b.book_ref,\n t.ticket_no,\n t.passenger_id,\n t.passenger_name,\n tf.fare_conditions,\n tf.amount,\n f.scheduled_departure_local,\n f.scheduled_arrival_local,\n f.departure_city || '(' || f.departure_airport || ')' as departure,\n f.arrival_city || '(' || f.arrival_airport || ')' as arrival,\n f.status,\n bp.seat_no\nFROM\n bookings b\n JOIN tickets t ON b.book_ref = t.book_ref\n JOIN ticket_flights tf ON tf.ticket_no = t.ticket_no\n JOIN flights_v f ON tf.flight_id = f.flight_id\n LEFT JOIN boarding_passes bp ON tf.flight_id = bp.flight_id AND tf.ticket_no = bp.ticket_no\nWHERE\n b.book_ref = '_QWE12'\nORDER BY\n t.ticket_no,\n f.scheduled_departure"
      },
      "uuid": "00000000-0000-0000-0000-000000000000",
      "version": null
    },
    "project": {
      "name": "default",
      "uuid": "00000000-0000-0000-0000-000000000000"
    }
  },
  "status": "OK"
}
Example:
Bash
HOLISTICDEV_API_KEY="<your-api-key>" HOLISTICDEV_DML_UUID="<dml-uuid>"; \
curl \
--header "x-api-key: $HOLISTICDEV_API_KEY" \
--header "Content-Type: application/json" \
--request GET "https://api.holistic.dev/api/v1/dml/$HOLISTICDEV_DML_UUID/source/"
Get DML abstract syntax tree
dml/:uuid/ast
GET
https://api.holistic.dev/api/v1/dml/:uuid/ast
Get DML abstract syntax tree (AST) by DML uuid
Path Parameters
200
{
  "data": {
    "dml": {
      "ast": [<abstract syntax tree>]
    }
  },
  "status": "OK"
}
Example:
Bash
HOLISTICDEV_API_KEY="<your-api-key>" HOLISTICDEV_DML_UUID="<dml-uuid>"; \
curl \
--header "x-api-key: $HOLISTICDEV_API_KEY" \
--header "Content-Type: application/json" \
--request GET "https://api.holistic.dev/api/v1/dml/$HOLISTICDEV_DML_UUID/ast/"
Get DML compiled object
dml/:uuid/object
GET
https://api.holistic.dev/api/v1/dml/:uuid/object
Get DML compiled object by DML uuid
Path Parameters
200
{
  "data": {
    "dml": {
      "object": [<compiled object>]
    }
  },
  "status": "OK"
}
Example:
Bash
HOLISTICDEV_API_KEY="<your-api-key>" HOLISTICDEV_DML_UUID="<dml-uuid>"; \
curl \
--header "x-api-key: $HOLISTICDEV_API_KEY" \
--header "Content-Type: application/json" \
--request GET "https://api.holistic.dev/api/v1/dml/$HOLISTICDEV_DML_UUID/object/"
pg_stat_statements and pg_stat_monitor
The pg_stat_statements and pg_stat_monitor modules provide a means for tracking execution statistics of all SQL statements executed by a server.
The holistic.dev API can process a whole pg_stat_statements/pg_stat_monitor snapshot in one request.
Exporting pg_stat_statements/pg_stat_monitor content is the most convenient, flexible, and secure way to organize the automatic export of SQL queries from PostgreSQL. These extensions are easy to configure for both on-premise installations and cloud providers: AWS, GCP, AZURE, ALIBABA CLOUD, DIGITAL OCEAN, YANDEX CLOUD, SBER CLOUD, and others.
pg_stat_statements and pg_stat_monitor may be disabled by default. Execute the following command to activate them:
-- pg_stat_statements
CREATE EXTENSION IF NOT EXISTS pg_stat_statements;
-- pg_stat_monitor
CREATE EXTENSION pg_stat_monitor;
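To verify that the extension is active, you can query its view; an error here usually means the module is not loaded. A sketch using the same placeholders as the pg_dump example above:
Bash
# Counts tracked statements; fails if pg_stat_statements is not loaded.
PGPASSWORD=<pg-password> psql -h <pg-host> -p <pg-port> -U <pg-username> -d <pg-db-name> \
  -c "SELECT count(*) FROM pg_stat_statements;"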
Privacy
The pg_stat_statements and pg_stat_monitor extensions normalize query entries. Normalization is a process whereby similar queries, typically differing only in their constants (though the exact rules are somewhat more subtle than that), are recognized as equivalent and are tracked as a single entry. This is particularly useful for non-prepared queries.
The normalization process intercepts constants in SQL statements run by users and replaces them with a placeholder (a dollar sign followed by a number).
For this reason, all data values used in SQL statements are blacked out and are not visible at holistic.dev.
But normalization has a downside.
When the parameters of functions and expressions are normalized, it is often impossible to determine the result type unambiguously. For example, the addition operator has 42 argument type combinations, and the normalized query
SELECT $1 + $2
can return 20 different result types. Such ambiguity negatively affects the analyzer's accuracy: the analysis results will be less precise, and some rules will not be applied.
pg_stat_statements
POST
https://api.holistic.dev/api/v1/pg_stat_statements
Upload pg_stat_statements or pg_stat_monitor snapshot
Request Body
200 pgss.income - the number of pg_stat_statements queries in your request
pgss.new - the number of new queries, which will be analyzed
{
  "status": "OK",
  "data": {
    "pgss": {
      "income": 100,
      "new": 100
    }
  }
}
Bash
HOLISTICDEV_API_KEY="<your-api-key>" HOLISTICDEV_PROJECT_NAME="<project-name>"; \
PG=$(PGPASSWORD=<pg-password> psql -t -A -h <pg-host> -p <pg-port> -U <pg-username> -d <pg-db-name> -c "SELECT json_agg(u) FROM (SELECT DISTINCT ON (queryid) queryid :: varchar as name, query as source, SUM(calls) :: varchar AS calls, SUM(total_time) :: varchar AS total_time, MIN(min_time) :: varchar AS min_time, MAX(max_time) :: varchar AS max_time, AVG(mean_time) :: varchar AS mean_time, SUM(rows) :: varchar AS rows FROM pg_stat_statements pgss JOIN pg_database pgd ON pgss.dbid = pgd.oid WHERE pgd.datname = current_database() AND queryid IS NOT NULL GROUP BY queryid, query) u" | base64 -w0); \
echo "{\"pgss\":\"$PG\", \"project\":{\"name\":\"$HOLISTICDEV_PROJECT_NAME\"}}" | \
curl \
--header "x-api-key: $HOLISTICDEV_API_KEY" \
--header "Content-Type: application/json" \
--request POST --data @- https://api.holistic.dev/api/v1/pg_stat_statements/
The base64 argument -w0 is needed to prevent line-wrapping of the result on Linux-based OSes.
holistic.dev parameters:
<your-api-key> - your account API key
<project-name> - project name
database parameters:
<pg-host> - database host
<pg-port> - database port
<pg-username> - database user
<pg-password> - database user password
<pg-db-name> - database name
You need to upload the DDL (database schema) before uploading a pg_stat_statements snapshot!
Make sure that <pg-username> has enough permissions to retrieve the query results. If there are problems with the output, the easiest fix is to DROP EXTENSION and CREATE EXTENSION again.
The query in the sample script aggregates statistics (calls, times, rows) per queryid.
Check Results
After a DDL or DML source is added, we try to parse and analyze it. This can take some time, especially for a big DDL. For this reason, parsing and analysis are performed asynchronously.
When we ship a new parser or analyzer, we rebuild all internal objects and analyze them with the new rules.
You can retrieve check results for the latest DDL by <project-uuid> or <ddl-uuid>.
A DML check result can be retrieved only by <dml-uuid>.
All of them have a similar result format.
A UUID points to the exact DDL/DML version, not the latest one.
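Because analysis is asynchronous, one way to wait for DDL results is to poll the project DDL details endpoint until check.status becomes finished, then fetch the check result. A sketch assuming jq is installed; the 10-second interval is arbitrary:
Bash
HOLISTICDEV_API_KEY="<your-api-key>" HOLISTICDEV_PROJECT_UUID="<project-uuid>"; \
until curl -s \
  --header "x-api-key: $HOLISTICDEV_API_KEY" \
  --request GET "https://api.holistic.dev/api/v1/project/$HOLISTICDEV_PROJECT_UUID/ddl" \
  | jq -e '.data.ddl.check.status == "finished"' > /dev/null; do sleep 10; done; \
curl -s \
  --header "x-api-key: $HOLISTICDEV_API_KEY" \
  --request GET "https://api.holistic.dev/api/v1/project/$HOLISTICDEV_PROJECT_UUID/ddl/check-result/"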
ddl/:uuid/check-result
GET
https://api.holistic.dev/api/v1/ddl/:uuid/check-result/
Path Parameters
200
{
"data": {
"ddl": {
"uuid": "00000000-0000-0000-0000-000000000000"
},
"analysis": {
"ddl": [
[
{
"name": "char-type",
"description": "Recommended avoid to use a precision specification for CHAR type for column \"aircraft_code\" in relation \"aircrafts_data\"",
"location": 1,
"position": {
"line": 1,
"column": 1
}
}
]
],
"dml": [],
"config": [
{
"char-type": "warning"
}
],
"statistics": {
"ddl": {
"kind": {
"architect": 14,
"error": 0,
"performance": 13
},
"total": 25
},
"dml": {
"kind": {
"architect": 30,
"error": 0,
"performance": 21
},
"total": 38
},
"common": {
"kind": {
"architect": 7,
"error": 0,
"performance": 0
},
"total": 7
}
}
}
},
"status": "OK"
}
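Example (a sketch following the same pattern as the other GET endpoints):
Bash
HOLISTICDEV_API_KEY="<your-api-key>" HOLISTICDEV_DDL_UUID="<ddl-uuid>"; \
curl \
  --header "x-api-key: $HOLISTICDEV_API_KEY" \
  --header "Content-Type: application/json" \
  --request GET "https://api.holistic.dev/api/v1/ddl/$HOLISTICDEV_DDL_UUID/check-result/"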
project/:uuid/ddl/check-result
GET
https://api.holistic.dev/api/v1/project/:uuid/ddl/check-result/
Path Parameters
200
{
"data": {
"ddl": {
"uuid": "00000000-0000-0000-0000-000000000000"
},
"analysis": {
"ddl": [
[
{
"name": "char-type",
"description": "Recommended avoid to use a precision specification for CHAR type for column \"aircraft_code\" in relation \"aircrafts_data\"",
"location": 1,
"position": {
"line": 1,
"column": 1
}
}
]
],
"dml": [],
"config": [
{
"char-type": "warning"
}
],
"statistics": {
"ddl": {
"kind": {
"architect": 14,
"error": 0,
"performance": 13
},
"total": 25
},
"dml": {
"kind": {
"architect": 30,
"error": 0,
"performance": 21
},
"total": 38
},
"common": {
"kind": {
"architect": 7,
"error": 0,
"performance": 0
},
"total": 7
}
}
}
},
"status": "OK"
}
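Example (a sketch following the same pattern as the other GET endpoints):
Bash
HOLISTICDEV_API_KEY="<your-api-key>" HOLISTICDEV_PROJECT_UUID="<project-uuid>"; \
curl \
  --header "x-api-key: $HOLISTICDEV_API_KEY" \
  --header "Content-Type: application/json" \
  --request GET "https://api.holistic.dev/api/v1/project/$HOLISTICDEV_PROJECT_UUID/ddl/check-result/"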
dml/:uuid/check-result
GET
https://api.holistic.dev/api/v1/dml/:uuid/check-result/
Path Parameters
200
{
"data": {
"dml": {
"uuid": "00000000-0000-0000-0000-000000000000"
},
"analysis": {
"ddl": [],
"dml": [],
"config": [],
"statistics": {
"ddl": {
"kind": {
"architect": 14,
"error": 0,
"performance": 13
},
"total": 25
},
"dml": {
"kind": {
"architect": 30,
"error": 0,
"performance": 21
},
"total": 38
},
"common": {
"kind": {
"architect": 7,
"error": 0,
"performance": 0
},
"total": 7
}
}
}
},
"status": "OK"
}
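Example (a sketch following the same pattern as the other GET endpoints):
Bash
HOLISTICDEV_API_KEY="<your-api-key>" HOLISTICDEV_DML_UUID="<dml-uuid>"; \
curl \
  --header "x-api-key: $HOLISTICDEV_API_KEY" \
  --header "Content-Type: application/json" \
  --request GET "https://api.holistic.dev/api/v1/dml/$HOLISTICDEV_DML_UUID/check-result/"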
Errors
The holistic.dev API uses conventional HTTP response codes to indicate the success or failure of an API request. In general: codes in the 2xx range indicate success; codes in the 4xx range indicate an error caused by the information provided (e.g., a required parameter was omitted); codes in the 5xx range indicate an error with holistic.dev's servers.
Error statuses
200 - OK Everything worked as expected.
400 - Bad Request The request was unacceptable, often due to missing a required parameter.
401 - Unauthorized No valid API key provided.
403 - Forbidden The API key doesn't have permissions to perform the request.
404 - Not Found The requested resource doesn't exist.
409 - Conflict The request conflicts with another request (perhaps due to using the same idempotent key).
429 - Too Many Requests Too many requests hit the API too quickly. We recommend an exponential backoff of your requests.
500 - Server Error Something went wrong on holistic.dev's end.
Error response messages
coming soon