Commit 5eed4743 authored by Waleed Akbar

Changes in Analytics.

- UNSPECIFIED option added to the "AnalyzerOperationMode" enum as a best practice.
- In SparkStreamer, changed the thresholds parameter from optional to compulsory.
parent 990395f4
2 merge requests: !294 Release TeraFlowSDN 4.0, !261 (CTTC) New Analytics Component
@@ -30,8 +30,9 @@ message AnalyzerId {
}
enum AnalyzerOperationMode {
-  ANALYZEROPERATIONMODE_BATCH = 0;
-  ANALYZEROPERATIONMODE_STREAMING = 1;
+  ANALYZEROPERATIONMODE_UNSPECIFIED = 0;
+  ANALYZEROPERATIONMODE_BATCH = 1;
+  ANALYZEROPERATIONMODE_STREAMING = 2;
}
message Analyzer {
......
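In proto3 the first enum value must be zero and is also what an unset field decodes to, so reserving 0 for ANALYZEROPERATIONMODE_UNSPECIFIED lets a consumer tell "mode never set" apart from a deliberate batch request. A minimal consumer-side sketch of that check follows; the generated module path and the operation_mode field name are assumptions for illustration and do not appear in this diff:

# Hypothetical guard against an unset operation mode.
# Module path and field name are assumptions, not taken from this commit.
from common.proto.analytics_frontend_pb2 import Analyzer, AnalyzerOperationMode

def check_operation_mode(analyzer: Analyzer) -> None:
    # proto3 decodes an unset enum field as 0, which now maps to UNSPECIFIED
    # instead of silently meaning ANALYZEROPERATIONMODE_BATCH.
    if analyzer.operation_mode == AnalyzerOperationMode.ANALYZEROPERATIONMODE_UNSPECIFIED:
        raise ValueError("Analyzer.operation_mode must be set to BATCH or STREAMING")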
@@ -73,7 +73,7 @@ def ApplyThresholds(aggregated_df, thresholds):
        )
    return aggregated_df
-def SparkStreamer(kpi_list, oper_list, window_size=None, win_slide_duration=None, thresholds=None, time_stamp_col=None):
+def SparkStreamer(kpi_list, oper_list, thresholds, window_size=None, win_slide_duration=None, time_stamp_col=None):
"""
Method to perform Spark operation Kafka stream.
NOTE: Kafka topic to be processesd should have atleast one row before initiating the spark session.
@@ -86,7 +86,6 @@ def SparkStreamer(kpi_list, oper_list, window_size=None, win_slide_duration=None
    if window_size is None: window_size = "60 seconds" # default
    if win_slide_duration is None: win_slide_duration = "30 seconds" # default
    if time_stamp_col is None: time_stamp_col = "time_stamp" # default
-   if thresholds is None: thresholds = {} # No threshold will be applied
    try:
        # Read data from Kafka
......
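Because thresholds is now a compulsory positional argument (and the thresholds-is-None default above was removed), callers must always pass a dictionary; per the removed line's comment, an empty dict means no thresholds are applied. A calling sketch under stated assumptions: the import path and the per-column (fail, raise) tuple layout consumed by ApplyThresholds are guesses for illustration, and the KPI id, operator, and bounds are made-up examples:

# Hypothetical caller of the new signature; only the parameter names and order
# come from the diff, everything else here is an illustrative assumption.
from analytics.backend.service.SparkStreaming import SparkStreamer  # assumed module path

thresholds = {
    "avg_value": (10.0, 90.0),  # assumed layout: column -> (fail_threshold, raise_threshold)
}

SparkStreamer(
    kpi_list           = ["6e22f180-ba28-4641-b190-2287bf448888"],  # example KPI UUID
    oper_list          = ["avg"],                                   # example aggregation operator
    thresholds         = thresholds,   # compulsory now; pass {} to apply no thresholds
    window_size        = "60 seconds",
    win_slide_duration = "30 seconds",
)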