Commit aa58d06

committed
Can now tag a GCE instance
1 parent 390a5fd commit aa58d06

28 files changed (+686, -749 lines)

LICENSE  (+1 -1)

@@ -1,6 +1,6 @@
 MIT License
 
-Copyright (c) 2018 DoiT International
+Copyright (c) 2021 DoiT International
 
 Permission is hereby granted, free of charge, to any person obtaining a copy
 of this software and associated documentation files (the "Software"), to deal

README.md  (+23 -77)

@@ -6,105 +6,51 @@ In Greek mythology, Iris (/ˈaɪrɪs/; Greek: Ἶρις) is the personification
 
 Iris helps to automatically assign labels to Google Cloud resources for better manageability and billing reporting. Each resource in Google Cloud will get an automatically generated label in a form of [iris_name:name], [iris_region:region] and finally [iris_zone:zone]. For example if you have a Google Compute Engine instance named `nginx`, Iris will automatically label this instance with [iris_name:nginx], [iris_region:us-central1] and [iris_zone:us-central1-a].
 
-Iris will also label short lived Google Compute Engine instances such as preemtible instances or instances managed by Instance Group Manager by listening to Stackdriver Logs and putting required labels on-demand.
+Iris will also label short-lived Google Compute Engine instances such as preemptible instances or instances managed by an Instance Group Manager by listening to Operations (Stackdriver) Logs and adding required labels on-demand.
 
-**NOTE**: Iris will try tagging resources in _all_ projects across your GCP organization, not just the project it is deployed into.
+**Supported Google Cloud Products**
 
-## Supported Google Cloud Products
-
-Iris is extensible through plugins, and new Google Cloud products may be supported by simply adding a plugin. Right now, there are plugins for the following products:
+Iris is extensible through plugins. New Google Cloud products may be supported by simply adding a plugin. Right now, there are plugins for the following products:
 
 * Google Compute Engine (including disks and snapshots)
 * Google Cloud Storage
+* Google CloudSQL
 * Google BigQuery
 * Google Bigtable
 
-## Installation
-
-We recommend to deploy Iris in a [separate](https://cloud.google.com/resource-manager/docs/creating-managing-projects#creating_a_project) project within your Google Cloud organization.
-To deploy, you will need to have the *Owner* role on the Iris project and the following roles in your *GCP Organization*:
-
-* _Organization Role Administrator_ - to create a custom IAM role for Iris that allows setting labels on the services
-(note this is different from _Organization Administrator_, which is in turn not related to Organization-level _Owner_)
-* _Security Admin_ OR _Organization Administrator_ - to allow the Iris App Engine service account to use the above role
-* _Logs Configuration Writer_ - to configure a log events stream on the Organization level to watch for new instances, databases, etc.
+**Installation**
 
-### Install dependencies
+We recommend deploying Iris in a [new project](https://cloud.google.com/resource-manager/docs/creating-managing-projects#creating_a_project) within your Google Cloud organization. You will need the following IAM permissions on your Google Cloud organization to complete the deployment:
 
-```
-pip2.7 install -r requirements.txt -t lib
-```
+* App Engine Admin
+* Logs Configuration Writer
+* Pub/Sub Admin
 
-Yes, we still use Python2.7. Yes, [we know](https://pythonclock.org/).
+##### Install dependencies
 
-#### Deploy
+`pip install -r requirements.txt -t lib`
 
-```
-./deploy.sh <project-id>
-```
+##### Deploy
+`./deploy.sh project-id`
 
-#### Configuration
+##### Configuration
 
-Configuration is stored in the config.json file. The file contains two arrays.
+Configuration is stored in the `config.json` file. The file contains two arrays.
 
-1. tags - A list of tags that will be applied to the resources (if the corresponding plugin implemented a function `_get_<TAGNAME>()`)
-2. on_demand - A list of plugins that will tag whenever a new object of their type is created
+1. `tags` - A list of tags that will be applied to the resources (if the plugin implements a function by the name `_get_<TAGNAME>`)
+2. `on_demand` - A list of plugins that will tag whenever there is a new object of their type (as opposed to tagging as part of a batch command).
+(Note: there is no support for CloudSQL for now.)
 
-```json
-{
-    "tags": [
-        "name",
-        "zone",
-        "region",
-        "location",
-        "instance_type"
-    ],
-    "on_demand": [
-        "Gce",
-        "BigQuery",
-        "Gcs",
-        "BigTable",
-        "GceDisks",
-        "GceSnapshots"
-    ]
-}
-```
 
 ### Local Development
 For local development run:
 
 `dev_appserver.py --log_level=debug app.yaml`
 
-Iris is easily extendable to support tagging of other GCP services. You will need to create a Python file in the /plugin directory with `register_signals`, `def api_name` and `methodsNames` functions as following:
-
-```python
-def register_signals(self):
-
-    """
-    Register with the plugin manager.
-    """
-
-    logging.debug("BigQuery class created and registering signals")
-```
-
-```python
-def api_name(self):
-    return "compute.googleapis.com"
-```
-
-```python
-# a list of log methods to listen on
-def methodsNames(self):
-    return ["storage.buckets.create"]
-```
-
-All plugins are derived from the `Plugin` class and need to implement the following functions:
-
-1. `do_tag(self, project_id)`
-1. `get_gcp_object(self, data)`
-1. `tag_one(self, gcp_object, project_id)`
-1. `api_name(self)`
-1. `methodsNames(self)`
+## Extension
+Iris is easily extendable to support tagging of other GCP services. You will need to create a Python file in the `/plugin` directory, implementing `register_signals`, `def api_name` and `methodsNames` functions as following:
 
+All plugins are derived from the `Plugin` class and need to implement the following functions:
 
-Each plugin will execute `gen_labels()`, which will loop over all the tags that are defined in the config file and will execute the `_get_<TAGNAME>()` function
+
+Each plugin will loop over all the tags that are defined in the config file and will execute the `_get_<TAGNAME>` function.
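For illustration, a minimal plugin skeleton following the contract described in the README might look like this. The `ExamplePlugin` class and its `_get_name` accessor are hypothetical, and the stand-in `Plugin` base class is an assumption — the real base class lives in `pluginbase.py`, which this commit does not show:

```python
import logging


class Plugin:
    """Stand-in for Iris's pluginbase.Plugin (an assumption; the real
    base class is not shown in this commit)."""

    def _gen_labels(self, gcp_object):
        # The real base class loops over the tags configured in
        # config.json and calls each _get_<TAGNAME> accessor; this
        # sketch hard-codes a single tag for illustration.
        return {"iris_name": self._get_name(gcp_object)}


class ExamplePlugin(Plugin):
    """Hypothetical plugin showing the functions the README requires."""

    def register_signals(self):
        # Called by the plugin manager at startup.
        logging.debug("ExamplePlugin registering signals")

    def api_name(self):
        return "compute.googleapis.com"

    def methodsNames(self):
        # Log methods that trigger on-demand labeling.
        return ["storage.buckets.create"]

    def _get_name(self, gcp_object):
        # One _get_<TAGNAME> accessor per entry in config.json's "tags".
        return gcp_object.get("name", "").lower()

    def do_tag(self, project_id):
        pass  # batch-label every resource of this type in the project

    def get_gcp_object(self, data):
        return data.get("resource")  # pull the resource from a log entry

    def tag_one(self, gcp_object, project_id):
        pass  # label a single resource using self._gen_labels(gcp_object)
```

Note that the plugin diffs in this same commit rename `methodsNames` to `method_names`, `do_tag` to `do_label`, and `tag_one` to `label_one`, so the README's names are already slightly behind the code.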

plugins/bigquery.py → alt_plugins/bigquery.py  (+9 -8)

@@ -1,13 +1,13 @@
 """Tagging BQ tables and datasets."""
 import logging
 import traceback
+import uuid
 
 from google.auth import app_engine
 from googleapiclient import discovery, errors
 from ratelimit import limits, sleep_and_retry
 
 from pluginbase import Plugin
-from utils import utils
 
 SCOPES = ['https://www.googleapis.com/auth/bigquery']
 
@@ -35,7 +35,7 @@ def api_name(self):
         return "bigquery-json.googleapis.com"
 
 
-    def methodsNames(self):
+    def method_names(self):
         return ["datasetservice.insert", "tableservice.insert"]
 
 
@@ -114,12 +114,12 @@ def get_gcp_object(self, data):
             return table
         except Exception as e:
             logging.error(e)
-            print(traceback.format_exc())
+            print((traceback.format_exc()))
 
         return None
 
 
-    def do_tag(self, project_id):
+    def do_label(self, project_id):
         """
         tag tables and data sets
         :param project_id: project id
@@ -162,7 +162,7 @@ def do_tag(self, project_id):
     @limits(calls=35, period=60)
     def tag_one_dataset(self, gcp_object):
         labels = dict()
-        labels['labels'] = self.gen_labels(gcp_object)
+        labels['labels'] = self._gen_labels(gcp_object)
         try:
             self.bigquery.datasets().patch(
                 projectId=gcp_object['datasetReference']['projectId'],
@@ -177,14 +177,15 @@ def tag_one_dataset(self, gcp_object):
     @limits(calls=35, period=60)
     def tag_one_table(self, gcp_object):
         labels = dict()
-        labels['labels'] = self.gen_labels(gcp_object)
+        labels['labels'] = self._gen_labels(gcp_object)
         try:
+
             self.batch.add(self.bigquery.tables().patch(
                 projectId=gcp_object['tableReference']['projectId'],
                 body=labels,
                 datasetId=gcp_object['tableReference']['datasetId'],
                 tableId=gcp_object['tableReference'][
-                    'tableId']), request_id=utils.get_uuid())
+                    'tableId']), request_id=uuid.uuid4())
             self.counter = self.counter + 1
             if self.counter == 1000:
                 self.do_batch()
@@ -193,7 +194,7 @@ def tag_one_table(self, gcp_object):
             if self.counter > 0:
                 self.do_batch()
 
-    def tag_one(self, gcp_object, project_id):
+    def label_one(self, gcp_object, project_id):
         try:
             if gcp_object['kind'] == "bigquery#dataset":
                 self.tag_one_dataset(gcp_object)
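The `tag_one_table` method above queues patch requests and flushes them via `do_batch()` once an internal counter reaches 1000. A dependency-free sketch of that batching pattern (the `LabelBatcher` class is hypothetical; the real code delegates to a googleapiclient batch object):

```python
class LabelBatcher:
    """Queue label requests and flush in batches of at most 1000,
    mirroring the counter/do_batch pattern in the plugins (a sketch)."""

    BATCH_SIZE = 1000

    def __init__(self, send_batch):
        self._send_batch = send_batch  # callable that executes one batch
        self._queue = []

    def add(self, request):
        self._queue.append(request)
        if len(self._queue) >= self.BATCH_SIZE:
            self.flush()

    def flush(self):
        # Mirrors do_batch(): send whatever is queued, then reset.
        if self._queue:
            self._send_batch(list(self._queue))
            self._queue.clear()
```

Anything still queued when iteration ends is flushed by a final explicit `flush()`, just as the plugins call `do_batch()` when `self.counter > 0`.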

plugins/bigtable.py → alt_plugins/bigtable.py  (+17 -13)

@@ -1,10 +1,13 @@
 import logging
+import uuid
 
 from google.auth import app_engine
 from googleapiclient import discovery, errors
 
+import utils.gcp_utils
+import utils.gcp_utils
 from pluginbase import Plugin
-from utils import gcp, utils
+from utils import gcp
 
 SCOPES = ['https://www.googleapis.com/auth/bigtable.admin']
 
@@ -32,7 +35,7 @@ def api_name(self):
         return "bigtableadmin.googleapis.com"
 
 
-    def methodsNames(self):
+    def method_names(self):
         return [
             "google.bigtable.admin.v2.BigtableInstanceAdmin.CreateInstance"]
 
@@ -61,7 +64,7 @@ def _get_zone(self, gcp_object):
     def _get_region(self, gcp_object):
         try:
             zone = self.get_location(gcp_object, gcp_object['project_id'])
-            region = gcp.region_from_zone(zone).lower()
+            region = utils.gcp_utils.region_from_zone(zone).lower()
         except KeyError as e:
             logging.error(e)
             return None
@@ -108,7 +111,7 @@ def get_gcp_object(self, data):
             return None
 
 
-    def do_tag(self, project_id):
+    def do_label(self, project_id):
         page_token = None
         more_results = True
         while more_results:
@@ -121,7 +124,7 @@ def do_tag(self, project_id):
                 return
             if 'instances' in result:
                 for inst in result['instances']:
-                    self.tag_one(inst, project_id)
+                    self.label_one(inst, project_id)
             if 'nextPageToken' in result:
                 page_token = result['nextPageToken']
             else:
@@ -130,24 +133,25 @@ def do_tag(self, project_id):
             self.do_batch()
 
 
-    def tag_one(self, gcp_object, project_id):
+    def label_one(self, gcp_object, project_id):
         labels = dict()
         gcp_object['project_id'] = project_id
-        labels['labels'] = self.gen_labels(gcp_object)
+        labels['labels'] = self._gen_labels(gcp_object)
         gcp_object.pop('project_id', None)
         if 'labels' in gcp_object:
-            for key, val in labels['labels'].items():
+            for key, val in list(labels['labels'].items()):
                 gcp_object['labels'][key] = val
         else:
             gcp_object['labels'] = {}
-            for key, val in labels['labels'].items():
+            for key, val in list(labels['labels'].items()):
                 gcp_object['labels'][key] = val
 
         try:
-            self.batch.add(self.bigtable.projects().instances(
-            ).partialUpdateInstance(
-                name=gcp_object['name'], body=gcp_object,
-                updateMask='labels'), request_id=utils.get_uuid())
+
+            self.batch.add(
+                self.bigtable.projects().instances().partialUpdateInstance(
+                    name=gcp_object['name'], body=gcp_object,
+                    updateMask='labels'), request_id=uuid.uuid4())
             self.counter = self.counter + 1
             if self.counter == 1000:
                 self.do_batch()
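The `_get_region` method above derives the region label by calling `region_from_zone`. That helper's body is not part of this commit, but GCE zone names are a region plus a single-letter suffix, so a plausible sketch (hypothetical implementation) is:

```python
def region_from_zone(zone):
    """Derive a region from a GCE zone name, e.g. 'us-central1-a' ->
    'us-central1'. Sketch of what utils/gcp_utils.region_from_zone
    likely does; the real helper is not shown in this commit."""
    return zone.strip().rsplit('-', 1)[0]
```

The plugin then lowercases the result before using it as the `iris_region` label value.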

plugins/cloudsql.py → alt_plugins/cloudsql.py  (+5 -6)

@@ -4,7 +4,6 @@
 from googleapiclient import discovery, errors
 
 from pluginbase import Plugin
-from utils import utils
 
 SCOPES = ['https://www.googleapis.com/auth/sqlservice.admin']
 
@@ -49,7 +48,7 @@ def api_name(self):
         return "sqladmin.googleapis.com"
 
 
-    def methodsNames(self):
+    def method_names(self):
         return ["cloudsql.instances.create"]
 
 
@@ -79,7 +78,7 @@ def get_gcp_object(self, data):
             return None
 
 
-    def do_tag(self, project_id):
+    def do_label(self, project_id):
         page_token = None
         more_results = True
         while more_results:
@@ -92,16 +91,16 @@ def do_tag(self, project_id):
             if 'items' not in response:
                 return
             for database_instance in response['items']:
-                self.tag_one(database_instance, project_id)
+                self.label_one(database_instance, project_id)
             if 'nextPageToken' in response:
                 page_token = response['nextPageToken']
             else:
                 more_results = False
 
 
-    def tag_one(self, gcp_object, project_id):
+    def label_one(self, gcp_object, project_id):
         labels = dict()
-        labels['labels'] = self.gen_labels(gcp_object)
+        labels['labels'] = self._gen_labels(gcp_object)
         try:
             database_instance_body = dict()
             database_instance_body['settings'] = {}
