- [Giess den Kiez API](#giess-den-kiez-api)
  - [W.I.P. API Migration](#wip-api-migration)
  - [Prerequisites](#prerequisites)
  - [Setup](#setup)
  - [Supabase (local)](#supabase-local)
  - [Environments and Variables](#environments-and-variables)
  - [Vercel](#vercel)
  - [Vercel Environment Variables](#vercel-environment-variables)
  - [API Routes /v3](#api-routes-v3)
  - [API Authorization](#api-authorization)
  - [Supabase](#supabase)
  - [Tests](#tests)
  - [Supabase](#supabase-1)
  - [Migrations and Types](#migrations-and-types)
  - [Deployment](#deployment)
  - [Radolan Harvester](#radolan-harvester)
  - [OSM Pumpen Harvester](#osm-pumpen-harvester)
  - [API Routes](#api-routes)
  - [API Authorization](#api-authorization-1)
  - [Supabase](#supabase-2)
  - [Tests](#tests-1)
  - [Contributors ✨](#contributors-)
  - [Credits](#credits)

# Giess den Kiez API based on Supabase

Supabase setup for Giess den Kiez. Built with TypeScript, it connects to Supabase and runs on vercel.com.
🚨 Might become part of the [giessdenkiez-de](https://github.com/technologiestiftung/giessdenkiez-de) repo eventually.

## Prerequisites

- [Vercel.com](https://vercel.com) account
- [Supabase](https://supabase.com) account
- Supabase CLI, installed with Homebrew: `brew install supabase/tap/supabase`
- [Docker](https://www.docker.com/), a dependency of Supabase
In the example code above the Postgres database and the PostgREST API are run locally.

Again: be a smart developer. Read https://12factor.net/config and https://github.com/motdotla/dotenv#should-i-have-multiple-env-files, and never, ever touch production with your local code!
### Vercel

Set up your Vercel.com account. You might need to log in: run `npx vercel login` in your shell. Link your local project to a Vercel project by running `npx vercel link` and following the instructions, or deploy your application with `npx vercel`. This will create a new project on vercel.com and deploy the application.

#### Vercel Environment Variables

Add all your environment variables to the Vercel project by running the commands below. The CLI will prompt for the values as input and lets you select whether they should be added to `development`, `preview` and `production`. For local development you can overwrite these values with an `.env` file in the root of your project. It is wise to have one Supabase project for production and one for preview. The preview project will then be used in deployment previews on GitHub. You can connect your Vercel project with your GitHub repository in the Vercel backend.

```bash
# the master key for supabase
vercel env add SUPABASE_SERVICE_ROLE_KEY
# the url to your supabase project
vercel env add SUPABASE_URL
# the anon key for supabase
vercel env add SUPABASE_ANON_KEY
# the max rows allowed to fetch from supabase (default 1000)
vercel env add SUPABASE_MAX_ROWS
```

To let these variables take effect you need to deploy your application once more.

```bash
vercel --prod
```
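As an illustration, a local `.env` overriding these variables might look like the sketch below. All values are placeholders, not real keys; the local URL shown is only an assumption about your Supabase setup.

```
# .env — local development overrides only; never commit real keys
SUPABASE_URL=http://localhost:54321
SUPABASE_ANON_KEY=<your-local-anon-key>
SUPABASE_SERVICE_ROLE_KEY=<your-local-service-role-key>
SUPABASE_MAX_ROWS=1000
```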
## API Routes /v3

There are 3 main routes: `/v3/get`, `/v3/post` and `/v3/delete`.

On the `/get` route all actions are controlled by passing URL params. On the `/post` and `/delete` routes you will have to work with additional POST bodies. For example, to fetch a specific tree run the following command.

```bash
curl --request GET \
  --url 'http://localhost:3000/v3/get/byid?id=_123456789'
```

You can see all the available routes in the [docs/api.http](./docs/api.http) file with all their needed `URLSearchParams` and JSON bodies, or by inspecting the JSON Schema that is returned when you make a request to the `/get`, `/post` or `/delete` route.

Currently we have these routes:

| `/v3/get`            | `/v3/post` | `/v3/delete` |
| :------------------- | :--------- | :----------- |
| `/byid`              | `/adopt`   | `/unadopt`   |
| `/treesbyids`        | `/water`   | `/unwater`   |
| `/adopted`           |            |              |
| `/istreeadopted`     |            |              |
| `/wateredandadopted` |            |              |
| `/lastwatered`       |            |              |
| `/wateredbyuser`     |            |              |
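As a sketch of how these URL-parameter routes compose, a small TypeScript helper could assemble `/v3/get` request URLs. This is our own illustration, not part of the codebase; the helper name `buildGetUrl` is hypothetical, and only the host, route names and the `id` param come from the examples above.

```typescript
// Hypothetical helper that assembles a /v3/get request URL from search params.
function buildGetUrl(
  base: string,
  route: string,
  params: Record<string, string>,
): string {
  const url = new URL(`/v3/get/${route}`, base);
  for (const [key, value] of Object.entries(params)) {
    url.searchParams.set(key, value);
  }
  return url.toString();
}

// e.g. fetch a specific tree by its id
const treeUrl = buildGetUrl("http://localhost:3000", "byid", { id: "_123456789" });
// treeUrl === "http://localhost:3000/v3/get/byid?id=_123456789"
```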
### API Authorization
### Supabase

You can sign up with the request below. You will get an access token to use in your requests.
On CI the Supabase instance is started automagically. See [.github/workflows/tests.yml](.github/workflows/tests.yml).

- **(Not recommended but possible)** Link your local project directly to the remote: `supabase link --project-ref <YOUR PROJECT REF>` (will ask you for your database password from the creation process)
- **(Not recommended but possible)** Push your local state directly to your remote project: `supabase db push` (will ask you for your database password from the creation process)
### Radolan Harvester

If you want to use the [DWD Radolan harvester](https://github.com/technologiestiftung/giessdenkiez-de-dwd-harvester) you need to prepare some data in your database:

- Update the table `radolan_harvester` with a time range for the last 30 days:

```sql
INSERT INTO "public"."radolan_harvester" ("id", "collection_date", "start_date", "end_date")
  VALUES (1, (
      SELECT
        CURRENT_DATE - INTEGER '1' AS yesterday_date),
    (
      SELECT
        (
          SELECT
            CURRENT_DATE - INTEGER '31')::timestamp + '00:50:00'),
    (
      SELECT
        (
          SELECT
            CURRENT_DATE - INTEGER '1')::timestamp + '23:50:00'));
```

- Update the table `radolan_geometry` with the SQL file [radolan_geometry.sql](sql/radolan_geometry.sql). This geometry covers Berlin only.
- Populate the table `radolan_data` with the content of [radolan_data.sql](sql/radolan_data.sql)

This process is still a bit of a black box that we need to solve.
### OSM Pumpen Harvester

The [giessdenkiez-de](https://github.com/technologiestiftung/giessdenkiez-de) repository fetches Pumpen data from Supabase via a GitHub Action defined in [pumps.yml](https://github.com/technologiestiftung/giessdenkiez-de/blob/master/.github/workflows/pumps.yml). The data is pushed to a Supabase bucket `data_assets`. For local development, it is created via [seed.sql](supabase/seed.sql). For deployments, the bucket needs to be created:

```sql
-- Create the public data_assets bucket
INSERT INTO storage.buckets(id, name)
  VALUES ('data_assets', 'data_assets');

CREATE POLICY "Public Access" ON storage.objects
  FOR SELECT
  USING (bucket_id = 'data_assets');

UPDATE
  "storage".buckets
SET
  "public" = TRUE
WHERE
  buckets.id = 'data_assets';
```
## API Routes

There are 3 main routes: `/get`, `/post` and `/delete`.

On the `/get` route all actions are controlled by passing URL params. On the `/post` and `/delete` routes you will have to work with additional POST bodies. For example, to fetch a specific tree run the following command.

```bash
curl --request GET \
  --url 'http://localhost:8080/get/byid?id=_123456789'
```

You can see all the available routes in the [docs/api.http](./docs/api.http) file with all their needed `URLSearchParams` and JSON bodies, or by inspecting the JSON Schema that is returned when you make a request to the `/get`, `/post` or `/delete` route.

Currently we have these routes:

| `/get`               | `/post`    | `/delete`    |
| :------------------- | :--------- | :----------- |
| `/byid`              | `/adopt`   | `/unadopt`   |
| `/treesbyids`        | `/water`   | `/unwater`   |
| `/adopted`           |            |              |
| `/istreeadopted`     |            |              |
| `/wateredandadopted` |            |              |
| `/lastwatered`       |            |              |
| `/wateredbyuser`     |            |              |

### API Authorization
#### Supabase

Some of the requests need an authorized user. You can create a new user with email and password via the Supabase API.
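A sketch of such a signup request via Supabase's standard Auth (GoTrue) endpoint is shown below. The URL, anon key and credentials are placeholders you must replace with your own project's values.

```bash
# Sign up a new user (Supabase Auth / GoTrue endpoint).
# Replace <SUPABASE_URL> and <SUPABASE_ANON_KEY> with your project's values.
curl --request POST \
  --url '<SUPABASE_URL>/auth/v1/signup' \
  --header 'apikey: <SUPABASE_ANON_KEY>' \
  --header 'Content-Type: application/json' \
  --data '{"email": "someone@example.com", "password": "strong-password"}'
```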