# How to locally run and test Analytics service
### Pre-requisites
The following requirements should be fulfilled before the execution of the Analytics service.
1. A virtual environment exists with all the required packages listed in [requirements.in](https://labs.etsi.org/rep/tfs/controller/-/blob/develop/src/analytics/frontend/requirements.in) successfully installed.
2. The Analytics backend service should be running.
3. All required Kafka topics must exist. Call `create_all_topics` from the [Kafka class](https://labs.etsi.org/rep/tfs/controller/-/blob/develop/src/common/tools/kafka/Variables.py) to create any topics that do not already exist.
```
from common.tools.kafka.Variables import KafkaTopic
KafkaTopic.create_all_topics()
```
4. There will be an input stream on the Kafka topic that the Streamer class will consume and apply the defined thresholds (a producer sketch follows this list).
   - A JSON encoded string should be generated in the following format:
```
'{"time_stamp": "2024-09-03T12:36:26Z", "kpi_id": "6e22f180-ba28-4641-b190-2287bf448888", "kpi_value": 44.22}'
```
   - `kpi_value` should be a float or an int.
   - The Kafka producer key should be the UUID of the Analyzer used when creating it.
   - Generate the stream on the following Kafka topic: `KafkaTopic.ANALYTICS_RESPONSE.value`.
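For illustration, a minimal producer sketch for this input stream, assuming the confluent-kafka client and a broker reachable at `localhost:9092` (adjust for your deployment); the analyzer UUID is a placeholder:
```
import json
from datetime import datetime, timezone
from confluent_kafka import Producer
from common.tools.kafka.Variables import KafkaTopic

# Broker address is an assumption; point it at your Kafka deployment.
producer = Producer({'bootstrap.servers': 'localhost:9092'})
message = json.dumps({
    'time_stamp': datetime.now(timezone.utc).strftime('%Y-%m-%dT%H:%M:%SZ'),
    'kpi_id'    : '6e22f180-ba28-4641-b190-2287bf448888',
    'kpi_value' : 44.22,
})
# The key must be the UUID of the Analyzer created in the next section (placeholder here).
producer.produce(KafkaTopic.ANALYTICS_RESPONSE.value, key='<analyzer_uuid>', value=message)
producer.flush()
```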
## Steps to create and start Analyzer
The analyzer can be declared as shown below, although there are many other ways to declare it:
The given object creation process for `_create_analyzer` involves defining an instance of the `Analyzer` message from the [gRPC definition](https://labs.etsi.org/rep/tfs/controller/-/blob/develop/proto/analytics_frontend.proto) and populating its fields.
```
from common.proto.analytics_frontend_pb2 import AnalyzerId
# ... (import of AnalyticsFrontendClient and construction of the Analyzer and
#      AnalyzerId messages elided; see the proto definition above) ...
analytics_client_object = AnalyticsFrontendClient()
analytics_client_object.StartAnalyzer(_create_analyzer_id)
```
### **How to Receive Analyzer Responses**
- `GetAlarms(<kpi_id>) -> KpiAlarms` is a method in the `KPI Value Api` that retrieves alarms for a given KPI ID. This method returns a stream of alarms associated with the specified KPI (see the sketch below).
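A minimal consumption sketch; the client class name, import path, and the `GetKpiAlarms` method name (taken from the KPI Value API flow below) are assumptions, so adapt them to the actual client code:
```
import logging
from common.proto.kpi_manager_pb2 import KpiId
# Client class and import path are assumptions.
from kpi_value_api.client.KpiValueApiClient import KpiValueApiClient

LOGGER = logging.getLogger(__name__)

kpi_id = KpiId()
kpi_id.kpi_id.uuid = '<kpi_uuid>'  # UUID of the KPI whose alarms are requested
client = KpiValueApiClient()
for alarm in client.GetKpiAlarms(kpi_id):  # server-streaming RPC yielding KpiAlarms
    LOGGER.debug(alarm)
```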
### **Understanding the Output of the Analyzer**
- **Output Column Names**: The output JSON string will include two keys for each defined threshold. For example, the `min_latency` threshold will generate two keys: `min_latency_THRESHOLD_FAIL` and `min_latency_THRESHOLD_RAISE` (an example record follows this list).
  - `min_latency_THRESHOLD_FAIL` is triggered if the average latency calculated over the defined window is below the specified threshold range.
  - `min_latency_THRESHOLD_RAISE` is triggered if the average latency calculated over the defined window exceeds the specified threshold range.
  - The thresholds `min_latency_THRESHOLD_FAIL` and `min_latency_THRESHOLD_RAISE` will have a value of `TRUE` if activated; otherwise, they will be set to `FALSE`.
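For example, an output record for the `min_latency` threshold might look like the following (hypothetical values):
```
'{"kpi_id": "6e22f180-ba28-4641-b190-2287bf448888", "min_latency_THRESHOLD_FAIL": false, "min_latency_THRESHOLD_RAISE": true}'
```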
# How to locally run and test KPI Manager micro-service
### Pre-requisites
Ensure the following requirements are met before executing the KPI management service.
1. A virtual environment exists with all the required packages listed in ["requirements.in"](https://labs.etsi.org/rep/tfs/controller/-/blob/develop/src/kpi_manager/requirements.in) successfully installed.
2. Verify the creation of the required database and tables. The [KPI DB test](https://labs.etsi.org/rep/tfs/controller/-/blob/develop/src/kpi_manager/tests/test_kpi_db.py) Python file lists the functions to create the database and tables.
### Message format templates
The ["messages"](https://labs.etsi.org/rep/tfs/controller/-/blob/develop/src/kpi_manager/tests/test_messages.py) python file contains templates for creating gRPC messages. The ["messages"](https://labs.etsi.org/rep/tfs/controller/-/blob/develop/src/kpi_manager/tests/test_messages.py) python file contains templates for creating gRPC messages.
### Flow of execution (KPI Manager Service functions)
1. Call the gRPC method `SetKpiDescriptor(KpiDescriptor)->KpiId` to add the KpiDescriptor to the `Kpi` DB. `KpiDescriptor` and `KpiId` are both pre-defined gRPC message types (see the sketch after this list).
2. Call `GetKpiDescriptor(KpiId)->KpiDescriptor` to read the `KpiDescriptor` from the DB and `DeleteKpiDescriptor(KpiId)` to delete the `KpiDescriptor` from the DB.
3. Call `SelectKpiDescriptor(KpiDescriptorFilter)->KpiDescriptorList` to get all `KpiDescriptor` objects that match the filter criteria. `KpiDescriptorFilter` and `KpiDescriptorList` are pre-defined gRPC message types.
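As an illustration, the steps above can be exercised from Python roughly as follows; the client class, import paths, and the sample-type value are assumptions based on the repository layout, so adapt them as needed:
```
from common.proto.kpi_manager_pb2 import KpiDescriptor, KpiId
from common.proto.kpi_sample_types_pb2 import KpiSampleType
# Client class and import path are assumptions.
from kpi_manager.client.KpiManagerClient import KpiManagerClient

client = KpiManagerClient()

descriptor = KpiDescriptor()
descriptor.kpi_description = 'Packets dropped on a device port'            # hypothetical KPI
descriptor.kpi_sample_type = KpiSampleType.KPISAMPLETYPE_PACKETS_DROPPED   # sample type is an assumption

kpi_id: KpiId = client.SetKpiDescriptor(descriptor)  # step 1: store the descriptor
stored = client.GetKpiDescriptor(kpi_id)             # step 2: read it back ...
client.DeleteKpiDescriptor(kpi_id)                   # ... and delete it
```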
# How to locally run and test KPI Value API
### Pre-requisites
Ensure the following requirements are met before executing the KPI Value API service.
2. A virtual environment exists with all the required packages listed in the ["requirements.in"](https://labs.etsi.org/rep/tfs/controller/-/blob/develop/src/kpi_value_api/requirements.in) file successfully installed.
3. Call the ["create_all_topics()"](https://labs.etsi.org/rep/tfs/controller/-/blob/develop/src/common/tools/kafka/Variables.py) function to verify the existence of all required topics on kafka.
### Message format templates
The ["messages"](https://labs.etsi.org/rep/tfs/controller/-/blob/develop/src/kpi_value_api/tests/messages.py) python file contains templates for creating gRPC messages. The ["messages"](https://labs.etsi.org/rep/tfs/controller/-/blob/develop/src/kpi_value_api/tests/messages.py) python file contains templates for creating gRPC messages.
### Unit test file
The ["KPI Value API test"](https://labs.etsi.org/rep/tfs/controller/-/blob/develop/src/kpi_value_api/tests/test_kpi_value_api.py) python file enlist various tests conducted to validate functionality. The ["KPI Value API test"](https://labs.etsi.org/rep/tfs/controller/-/blob/develop/src/kpi_value_api/tests/test_kpi_value_api.py) python file enlist various tests conducted to validate functionality.
### Flow of execution (KPI Value API Service functions)
1. Call `StoreKpiValues(KpiValueList)` to produce `Kpi Value` on a Kafka topic. (The `KpiValueWriter` microservice will consume and process this `Kpi Value`.)
2. Call `SelectKpiValues(KpiValueFilter) -> KpiValueList` to read metrics from the Prometheus DB.
3. Call `GetKpiAlarms(KpiId) -> KpiAlarms` to read alarms from Kafka (see the sketch after this list).
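A rough sketch of this flow; the client class, import paths, and message field names are assumptions (see the messages templates above for authoritative examples):
```
from common.proto.kpi_value_api_pb2 import KpiValueList, KpiValueFilter
# Client class and import path are assumptions.
from kpi_value_api.client.KpiValueApiClient import KpiValueApiClient

client = KpiValueApiClient()

kpi_values = KpiValueList()               # step 1: populate and produce to Kafka
value = kpi_values.kpi_value_list.add()   # repeated-field name is an assumption
value.kpi_id.kpi_id.uuid = '<kpi_uuid>'
client.StoreKpiValues(kpi_values)

value_filter = KpiValueFilter()           # step 2: read metrics back from Prometheus
for kpi_value in client.SelectKpiValues(value_filter).kpi_value_list:
    print(kpi_value)
```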
# How to locally run and test the KPI Value Writer
### Pre-requisites
Ensure the following requirements are met before executing the KPI Value Writer service.
1. The KPI Manager and KPI Value API services are running.
2. A virtual environment exists with all the required packages listed in the ["requirements.in"](https://labs.etsi.org/rep/tfs/controller/-/blob/develop/src/kpi_value_writer/requirements.in) file successfully installed.
### Message format templates
The ["messages"](https://labs.etsi.org/rep/tfs/controller/-/blob/develop/src/kpi_value_writer/tests/test_messages.py) python file contains the templates to create gRPC messages. The ["messages"](https://labs.etsi.org/rep/tfs/controller/-/blob/develop/src/kpi_value_writer/tests/test_messages.py) python file contains the templates to create gRPC messages.
### Flow of execution
1. The service runs continuously, consuming KPI values from the Kafka topic and pushing KPI metrics to Prometheus, as sketched below.
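A minimal local-launch sketch, assuming the `KpiValueWriter` service class from `src/kpi_value_writer/service` exposes a `start()` entry point (the exact class name, import path, and entry point may differ):
```
# Class name, import path, and entry point are assumptions.
from kpi_value_writer.service.KpiValueWriter import KpiValueWriter

writer = KpiValueWriter()
writer.start()  # consume KPI values from Kafka and push metrics to Prometheus
```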
# How to locally run and test Telemetry service
### Pre-requisites
The following requirements should be fulfilled before the execution of the Telemetry service.
1. A virtual environment exists with all the required packages listed in [requirements.in](https://labs.etsi.org/rep/tfs/controller/-/blob/develop/src/telemetry/requirements.in) successfully installed.
2. The Telemetry backend service should be running.
3. All required Kafka topics must exist. Call `create_all_topics` from the [Kafka class](https://labs.etsi.org/rep/tfs/controller/-/blob/develop/src/common/tools/kafka/Variables.py) to create any topics that do not already exist.
```
from common.tools.kafka.Variables import KafkaTopic
KafkaTopic.create_all_topics()
```
## Steps to create telemetry collector
The collector can be declared as shown below, although there are many other ways to declare it:
```
import uuid
from common.proto import telemetry_frontend_pb2  # import path is an assumption, parallel to analytics_frontend_pb2

_create_collector_request = telemetry_frontend_pb2.Collector()
_create_collector_request.collector_id.collector_id.uuid = str(uuid.uuid4())
_create_collector_request.kpi_id.kpi_id.uuid = str(uuid.uuid4())
_create_collector_request.duration_s = 100 # in seconds
_create_collector_request.interval_s = 10 # in seconds
```
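Once declared, the collector can be started through the Telemetry frontend; a minimal sketch, assuming a `TelemetryFrontendClient` analogous to the `AnalyticsFrontendClient` used above:
```
# Client class name and import path are assumptions.
from telemetry.frontend.client.TelemetryFrontendClient import TelemetryFrontendClient

client = TelemetryFrontendClient()
collector_id = client.StartCollector(_create_collector_request)  # returns the CollectorId
```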