Commit 2f456f3

[automated commit] Bump docs to version 1.18.0

leorossi authored and github-actions[bot] committed Jan 19, 2024
1 parent 5ce142f commit 2f456f3

Showing 158 changed files with 186 additions and 60 deletions.
@@ -18,7 +18,7 @@ A Platformatic Service is an HTTP server based on [Fastify](https://www.fastify.
With Platformatic Service you can:
- Add custom functionality in a [Fastify plugin](https://fastify.dev/docs/latest/Reference/Plugins)
- Write plugins in JavaScript or [TypeScript](https://www.typescriptlang.org/)
- Optionally user TypeScript to write your application code
- Optionally use TypeScript to write your application code

A Platformatic Service is the basic building block of a Platformatic application.

@@ -34,7 +34,7 @@ Create a `platformatic.db.json` file in the root project, it will be loaded auto
- Once Platformatic DB starts, its API will be available at `http://127.0.0.1:3042`
- It will connect and read the schema from a PostgreSQL DB
- Will read migrations from `./migrations` directory
- Will load custom functionallity from `./plugin.js` file.
- Will load custom functionality from `./plugin.js` file.
## Database and Migrations

Start the database using the sample `docker-compose.yml` file.
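
For example, with a recent Docker that ships the compose plugin (a standalone `docker-compose` binary works the same way), the database can be started in the background with:

```bash
# start the services defined in the sample docker-compose.yml
docker compose up -d
```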
@@ -258,7 +258,7 @@ curl http://localhost:3042/v2/users

### Generating the mapper plugin with OpenAI

Platformatic supports generating the mapper plugin automatically using the OpenAI. Before doing make sure that you have a platformatic user API key. If not you can generate it by running the following command:
Platformatic supports generating the mapper plugin automatically using the OpenAI. Before doing so, make sure you have a platformatic user API key. If not, you can generate it by running the following command:

```bash
npx platformatic@latest login
@@ -1187,13 +1187,13 @@ Which should give us a response like this:
[{"id":1,"title":"Maximum Overdrive","directorId":1,"producerId":4,"releasedYear":1986,"createdAt":"1687996711612","updatedAt":"1687996711612","directorName":"Stephen King","producerName":"Martha Schumacher"},{"id":2,"title":"The Shining","directorId":5,"producerId":1,"releasedYear":1980,"createdAt":"1687996711619","updatedAt":"1687996711619","directorName":"Mick Garris","producerName":"Stephen King"},{"id":3,"title":"Kajillionaire","directorId":2,"producerId":6,"releasedYear":2020,"createdAt":"1687996711621","updatedAt":"1687996711621","directorName":"Miranda July","producerName":"Dede Gardner"}]
```

Our Library app is now succesfully running in production! 🎉
Our Library app is now successfully running in production! 🎉

### Automate deployment with GitHub Actions

If we want to automate pull request preview and production deployments of our app to Platformatic Cloud, we can do it with GitHub Actions by:

1. Creating a new repository on GitHub, then commiting and push up the code for our Library app.
1. Creating a new repository on GitHub, then committing and push up the code for our Library app.
2. Following the [Cloud Quick Start Guide](https://docs.platformatic.cloud/docs/quick-start-guide/?utm_campaign=Build%20and%20deploy%20a%20modular%20monolith%20with%20Platformatic&utm_medium=blog&utm_source=Platformatic%20Blog) to configure the deployment for our app. We can skip the step for creating a GitHub repository.

## Next steps
@@ -35,7 +35,7 @@ Older Platformatic applications might not have the same layout, if so you can up
## Compiling for deployment

Compiling for deployment is then as easy as running `plt service compile` in that same folder.
Rememeber to set `PLT_TYPESCRIPT=false` in your environment variables in the deployed environments.
Remember to set `PLT_TYPESCRIPT=false` in your environment variables in the deployed environments.

## Usage with Runtime

@@ -6,7 +6,7 @@ deployment guide.
## Adding `sqlite` for debugging

With a combination of Docker and Fly.io, you can create an easy way to debug
your sqlite aplication without stopping your application or exporting the data.
your sqlite application without stopping your application or exporting the data.
At the end of this guide, you will be able to run `fly ssh console -C db-cli` to
be dropped into your remote database.

@@ -29,7 +29,7 @@ This guide

## Configure the new Platformatic app

documentation to create a new Platformatic app. Every Platformatic app uses the "Movie" demo entity and includes
Every Platformatic app uses the "Movie" demo entity and includes
the corresponding table, migrations, and REST API to create, read, update, and delete movies.

Once the new Platformatic app is ready:
@@ -179,7 +179,7 @@ cd rest-api-frontend/src
npx platformatic client http://127.0.0.1:3042 --frontend --name foobar --language ts
```

will generated `foobar.ts` and `foobar-types.d.ts`
This will generate `foobar.ts` and `foobar-types.d.ts`
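
As a rough usage sketch of the generated frontend client (the `setBaseUrl` helper and the operation-derived `getMovies` are assumptions; the actual exports depend on your OpenAPI operations):

```ts
// hypothetical imports from the generated foobar.ts
import { setBaseUrl, getMovies } from './foobar'

// point the client at the running Platformatic app (assumed helper)
setBaseUrl('http://127.0.0.1:3042')

// call an operation-derived helper; real names follow your OpenAPI spec
const movies = await getMovies({})
console.log(movies)
```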


## React and Vue.js components that read, create, and update an entity
@@ -48,7 +48,7 @@ npm i pino-elasticsearch

## Configure Logger Transport

Configuring your platformatic application to log to ElasticSearch is straighforward,
Configuring your Platformatic application to log to ElasticSearch is straightforward,
you just have to configure it like the following:

```json
@@ -75,7 +75,7 @@
```

This snippet can be applied either to the `platformatic.runtime.json` config
for Platformatic Runtime applications, or as part of the applicaiton configuration
for Platformatic Runtime applications, or as part of the application configuration
for any other application.

This setup will allow you to log both to the terminal (TTY)
@@ -128,7 +128,7 @@ Copy over the `routes` directory from your Express app.

### Install @fastify/express

Install the [`@fastify/express`](https://www.npmjs.com/package/@fastify/express) Fastify plugin to add full Express compability to your Platformatic Service app:
Install the [`@fastify/express`](https://www.npmjs.com/package/@fastify/express) Fastify plugin to add full Express compatibility to your Platformatic Service app:

```bash
npm install @fastify/express
@@ -409,7 +409,7 @@ Platformatic Service also provides many other features that are built on top of

- Metrics with [`fastify-metrics`](https://www.npmjs.com/package/fastify-metrics)
- Healthcheck endpoint with [`@fastify/under-pressure`](https://github.com/fastify/under-pressure)
- OpenAPI specification and Swagger UI with [`@fastify/swagger`](https://www.npmjs.com/package/@fastify/swagger) and [`@fastify/swagger-ui`](https://www.npmjs.com/package/@fastify/swagger-ui)
- OpenAPI specification and Scalar with [`@fastify/swagger`](https://www.npmjs.com/package/@fastify/swagger) and [`@scalar/fastify-api-reference`](https://www.npmjs.com/package/@scalar/fastify-api-reference)
- GraphQL API support with [`mercurius`](https://www.npmjs.com/package/mercurius)
- CORS support with [`@fastify/cors`](https://github.com/fastify/fastify-cors)
- Configuration with environment variable validation
@@ -22,6 +22,22 @@ Platformatic can be configured to expose Prometheus metrics:
In this case, we are exposing the metrics on port 9091 (defaults to `9090`), and we are using basic authentication to protect the endpoint.
We can also specify the IP address to bind to (defaults to `0.0.0.0`).
Note that the metrics port is not the default in this configuration. This is because if you want to test the integration running both Prometheus and Platformatic on the same host, Prometheus starts on `9090` port too.
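
For reference, a minimal sketch of a metrics block matching that description, with `port`, `hostname`, and `auth` taken from the settings mentioned above (the exact shape is an assumption):

```json
{
  "metrics": {
    "port": 9091,
    "hostname": "0.0.0.0",
    "auth": {
      "username": "platformatic",
      "password": "mysecret"
    }
  }
}
```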

Prometheus recommends using a port different from the main application port for serving the metrics. But, it is possible to serve metrics on the same port as the application by setting `"server": "parent"` in the `metrics` configuration. It is also possible to change the endpoint on which metrics are being served by passing the `endpoint` property. The following example configuration illustrates this:

```json
...
"metrics": {
"server": "parent",
"endpoint": "/platformatic-app-metrics",
"auth": {
"username": "platformatic",
"password": "mysecret"
}
}
...
```

All the configuration settings are optional. To use the default settings, set `"metrics": true`. See the [configuration reference](/reference/db/configuration.md#metrics) for more details.

:::caution
@@ -106,7 +106,7 @@ The snippet above defines a `Post` model with the following fields and propertie
- `viewCount`: An `Int` field with a default value of 0.
- `createdAt`: A `DateTime` field with a timestamp of when the value is created as its default value.

By default, Prisma maps the model name and its format to the table name — which is also used im Prisma Client. Platformatic DB uses a snake casing and pluralized table names to map your table names to the generated API. The `@@map()` attribute in the Prisma schema allows you to define the name and format of your table names to be used in your database. You can also use the `@map()` attribute to define the format for field names to be used in your database. Refer to the [Foreign keys and table names naming conventions](#foreign-keys-and-table-names-naming-conventions) section to learn how you can automate formatting foreign keys and table names.
By default, Prisma maps the model name and its format to the table name — which is also used in Prisma Client. Platformatic DB uses a snake casing and pluralized table names to map your table names to the generated API. The `@@map()` attribute in the Prisma schema allows you to define the name and format of your table names to be used in your database. You can also use the `@map()` attribute to define the format for field names to be used in your database. Refer to the [Foreign keys and table names naming conventions](#foreign-keys-and-table-names-naming-conventions) section to learn how you can automate formatting foreign keys and table names.
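
For instance, a sketch of how `@@map()` and `@map()` can align a model with Platformatic DB's snake_cased, pluralized naming (the model and field names here are illustrative, not taken from the tutorial schema):

```prisma
model UserPost {
  id        Int      @id @default(autoincrement())
  createdAt DateTime @default(now()) @map("created_at")

  @@map("user_posts")
}
```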

Next, run the following command to generate an up and down migration:

@@ -73,7 +73,7 @@ Note that the role of an admin user from `adminSecret` strategy is `platformatic

## Read-only access to _anonymous_ users

The following configuration will allo all _anonymous_ users (e.g. each user without a known role)
The following configuration will allow all _anonymous_ users (e.g. each user without a known role)
to access the `pages` table / `page` entity in Read-only mode:
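
A minimal sketch of such a rule, assuming the `role`/`entity`/`find`/`save`/`delete` fields of Platformatic DB authorization rules:

```json
{
  "authorization": {
    "rules": [
      {
        "role": "anonymous",
        "entity": "page",
        "find": true,
        "save": false,
        "delete": false
      }
    ]
  }
}
```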


@@ -224,7 +224,7 @@ Open `platformatic.composer.js` and change it to the following:
}
```
Note that we just added `test-service` as `origin` of the proxed service and added the usual `telementry` configuration, with a different `serviceName`.
Note that we just added `test-service` as `origin` of the proxied service and added the usual `telemetry` configuration, with a different `serviceName`.
Finally, start the composer:
@@ -267,7 +267,7 @@ You can then click on the trace and see the details:
![image](./telemetry-images/jaeger-3.png)
Note that everytime a request is received or client call is done, a new span is started. So we have:
Note that every time a request is received or client call is done, a new span is started. So we have:
- One span for the request received by the `test-composer`
- One span for the client call to `test-service`
- One span for the request received by `test-service`
@@ -26,7 +26,7 @@ the same credentials to deploy to one.
A workspace can be either static or dynamic.
A static workspace always deploy to the same domain, while
in a dynamic workspace each deployment will have its own domain.
The latter are useful to provde for pull request previews.
The latter are useful to provide for pull request previews.

### Can I change or upgrade my plan after I start using Platformatic?

@@ -35,4 +35,4 @@ Plans can be changed or upgraded at any time
### What does it mean I can set my own CNAME?

Free applications only gets a `*.deploy.space` domain name to access
their application. All other plans can set it to a domain of their chosing.
their application. All other plans can set it to a domain of their choosing.
@@ -192,9 +192,9 @@ Once the GitHub Actions deployment workflow has completed, go to the `production
for your app in Platformatic Cloud. Click on the link for the **Entry Point**. You should now
see the Platformatic DB app home page.

Click on the **OpenAPI Documentation** link to try out your app's REST API using the Swagger UI.
Click on the **OpenAPI Documentation** link to try out your app's REST API using Scalar.

![Screenshot of Swagger UI for a Platformatic DB app](./images/quick-start-guide/platformatic-db-swagger-ui.png)
![Screenshot of Scalar for a Platformatic DB app](./images/quick-start-guide/platformatic-db-scalar.png)

## Preview pull request changes

@@ -50,15 +50,15 @@ When an application is deployed on the cloud, it uses [opentelemetry](https://op

![Platformatic App and Risk engine](./images/risk-engine/service-risk-engine.png)

The risk engine collects automatically data about the service calls (meaning that no actual data exchanged by services are collected, just the service name, i.e. the `path` and the `method` of the calls) in the form of open telemtry `traces` of `spans`. See [Opentelemetry documentation](https://opentelemetry.io/docs/concepts/signals/traces) about the details on how the informations are sent form the services to the Risk Engine (or any other Open Telemetry backend).
The risk engine collects automatically data about the service calls (meaning that no actual data exchanged by services are collected, just the service name, i.e. the `path` and the `method` of the calls) in the form of open telemetry `traces` of `spans`. See [Opentelemetry documentation](https://opentelemetry.io/docs/concepts/signals/traces) about the details on how the informations are sent form the services to the Risk Engine (or any other Open Telemetry backend).

As said, this happens automatically and transparently in Platformatic Cloud for both OpenAPI and GraphQL services.

:::info
In the GraphQL we have only one HTTP endpoint (which is by default `POST/graphql`). In this case we assume that an operation is actually the type of the graphQL request (e.g. `query` or `mutation`) and the name of the query.
:::

When a PR is created, the risk engine is triggered and it calculates the risk of the change. This is done automatically by the Platformatic Github Action if they has been created as explaine in the [Cloud Quick Start Guide](./quick-start-guide), that:
When a PR is created, the risk engine is triggered and it calculates the risk of the change. This is done automatically by the Platformatic Github Actions if they have been created as explained in the [Cloud Quick Start Guide](./quick-start-guide), that:
- Calculates the operations changed by the PR
- Ask to the risk engine the risk of that change

@@ -351,10 +351,25 @@ platformatic composer <command>
```


#### create

Creates a new Platformatic Composer application.

Options are

* `dir <string>` - the directory where to create the project (Default: `process.cwd() + 'platformatic-composer'`)
* `port <string>` - the port where the application will listen (Default: `3042`)
* `hostname <string>` - the hostname where the application will listen (Default: `0.0.0.0`)
* `git <boolean>` - Init the git repository (Default: `true`)
* `typescript <boolean>` - Use Typescript (Default: `false`)
* `install <boolean>` - Run or not `npm install` after creating the files (Default: `true`)
* `plugin <boolean>` - Creates a sample plugin and tests (Default: `true`)
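
As a hypothetical invocation, assuming each option maps to a `--<name>` command line flag:

```bash
# scaffold a TypeScript composer project in ./my-composer listening on port 3043
npx platformatic composer create --dir my-composer --port 3043 --typescript
```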

#### help

Available commands:

* `create` - creates a new Platformatic Composer application.
* `help` - show this help message.
* `help <command>` - shows more information about a command.
* `start` - start the server.
@@ -460,6 +475,22 @@ You can find more details about the configuration format here:
* [Platformatic DB Configuration](https://docs.platformatic.dev/docs/reference/db/configuration)


#### create

Creates a new Platformatic DB application.

Options are

* `dir <string>` - the directory where to create the project (Default: `process.cwd() + 'platformatic-composer'`)
* `port <string>` - the port where the application will listen (Default: `3042`)
* `hostname <string>` - the hostname where the application will listen (Default: `0.0.0.0`)
* `connectionString <string>` - the connection string for your database (Default: `sqlite://./db.sqlite`)
* `migrations <boolean>` - Creates sample migrations (Default: `true`)
* `git <boolean>` - Init the git repository (Default: `true`)
* `typescript <boolean>` - Use Typescript (Default: `false`)
* `install <boolean>` - Run or not `npm install` after creating the files (Default: `true`)
* `plugin <boolean>` - Creates a sample plugin and tests (Default: `true`)
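
Similarly, a hypothetical `db create` invocation (again assuming `--<name>` flags):

```bash
# scaffold a DB project pointed at PostgreSQL instead of the default SQLite file
npx platformatic db create --dir my-db --connectionString postgres://postgres:postgres@127.0.0.1:5432/postgres
```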

#### help

Available commands:
@@ -757,10 +788,25 @@ You can find more details about the configuration format here:
* [Platformatic Service Configuration](https://docs.platformatic.dev/docs/reference/service/configuration)


#### create

Creates a new Platformatic Service application.

Options are

* `dir <string>` - the directory where to create the project (Default: `process.cwd() + 'platformatic-composer'`)
* `port <string>` - the port where the application will listen (Default: `3042`)
* `hostname <string>` - the hostname where the application will listen (Default: `0.0.0.0`)
* `git <boolean>` - Init the git repository (Default: `true`)
* `typescript <boolean>` - Use Typescript (Default: `false`)
* `install <boolean>` - Run or not `npm install` after creating the files (Default: `true`)
* `plugin <boolean>` - Creates a sample plugin and tests (Default: `true`)

#### help

Available commands:

* `create` - creates a new Platformatic Service application.
* `help` - show this help message.
* `help <command>` - show more information about a command.
* `start` - start the server.
@@ -298,15 +298,15 @@ fastify.post('/', async (request, reply) => {
fastify.listen({ port: 3000 })
```

Note that you would need to install `@platformatic/client` as a depedency.
Note that you would need to install `@platformatic/client` as a dependency.

## How are the method names defined in OpenAPI

The names of the operations are defined in the OpenAPI specification.
Specifically, we use the [`operationId`](https://swagger.io/specification/).
If that's not part of the spec,
the name is generated by combining the parts of the path,
like `/something/{param1}/` and a method `GET`, it genertes `getSomethingParam1`.
like `/something/{param1}/` and a method `GET`, it generates `getSomethingParam1`.
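
A short sketch of that rule in practice, assuming `buildOpenAPIClient` is pointed at a spec that exposes `GET /something/{param1}` without an `operationId` (the URL below is hypothetical):

```ts
import { buildOpenAPIClient } from '@platformatic/client'

const client = await buildOpenAPIClient({
  url: 'http://127.0.0.1:3042/documentation/json'
})

// the generated name combines the HTTP method and the path segments
const result = await client.getSomethingParam1({ param1: 42 })
```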

## Authentication
