- [8.1. Release Disk Space](#81-release-disk-space)
- [8.2. Activate DEBUG mode for microservices and collect logs](#82-activate-debug-mode-for-microservices-and-collect-logs)
- [8.3. Download a snapshot of the Context database](#83-download-a-snapshot-of-the-context-database)

    ## **8.1. Release Disk Space**
    
    **Page under construction.**
    
Clean up unused Docker images and related objects when `/var/lib/docker` starts taking up too much space:
    
    ```bash
    docker system prune -a
    ```
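Before pruning, it may help to see what is actually consuming the space, and to try a gentler cleanup first. A sketch, assuming a standard Docker installation:

```shell
# Show how much space images, containers, volumes, and build cache consume
docker system df

# A gentler first step: remove only dangling (untagged) images
docker image prune -f
```

Note that `docker system prune -a` also removes stopped containers, unused networks, and all unused (not just dangling) images; add `--volumes` only if you are certain no data volume is still needed.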
    
    ## **8.2. Activate DEBUG mode for microservices and collect logs**
    
In some cases, a component may stop working or report errors.
When that happens, it makes sense to activate DEBUG mode on the affected components and collect their logs.
    
    
<h3><u>Activate DEBUG mode in components</u></h3>
    
    Before deploying the TeraFlowSDN, in the [manifests](https://labs.etsi.org/rep/tfs/controller/-/tree/master/manifests) folder, modify the appropriate files for the microservices to be inspected, e.g. `contextservice.yaml`, `deviceservice.yaml`, `serviceservice.yaml`, `pathcompservice.yaml`, and `nbiservice.yaml`, by changing environment variable `LOG_LEVEL` to `DEBUG`.
    
    ```yaml
    apiVersion: apps/v1
    kind: Deployment
    #...
    spec:
      # ...
      template:
        # ...
        spec:
          # ...
          containers:
            # ...
            - name: server
              # ...
              env:
                # ...
                - name: LOG_LEVEL
                  value: "INFO" # change to "DEBUG"
                # ...
    ```
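If you prefer to switch the log level from the command line instead of editing each file by hand, a `sed` sketch, assuming you run it from the root of your controller checkout and that `value: "INFO"` occurs only in the `LOG_LEVEL` variable of those manifests:

```shell
cd ~/tfs-ctrl   # adjust to your checkout location
for m in contextservice deviceservice serviceservice pathcompservice nbiservice; do
    sed -i 's/value: "INFO"/value: "DEBUG"/' "manifests/${m}.yaml"
done
```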
    
<h3><u>Redeploy TeraFlowSDN</u></h3>
    
Redeploy TeraFlowSDN as usual, using the example `my_deploy.sh` specification or whatever file you created with your deployment settings.

    ```bash
    source my_deploy.sh
    ./deploy/all.sh
    ```
    Wait for the deployment to finish.
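Before continuing, you can check that all pods restarted correctly. A sketch, assuming the default `tfs` namespace set via `TFS_K8S_NAMESPACE` in the deploy specs:

```shell
# List the TeraFlowSDN pods; all should eventually reach STATUS "Running"
kubectl get pods --namespace tfs

# Optionally, block until all deployments report ready (timeout as needed)
kubectl wait --namespace tfs --for=condition=available deployment --all --timeout=300s
```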
    
<h3><u>Use TeraFlowSDN</u></h3>
    
Perform the actions you were testing when the misbehavior appeared, such as onboarding a topology or creating a connectivity service.
    
<h3><u>Collect log files</u></h3>
    
A number of helper scripts, named `show_logs_<component>.sh`, are provided in the [scripts](https://labs.etsi.org/rep/tfs/controller/-/tree/master/scripts) folder to facilitate log collection.
By default, these scripts dump the logs to the screen, but their output can be redirected to files when needed.
    
In the following example, the logs of `context`, `device`, `service`, `pathcomp-frontend`, and `nbi` are redirected to separate log files instead of being printed on the screen.
    
    ```bash
    cd ~/tfs-ctrl
    ./scripts/show_logs_context.sh > context.log
    ./scripts/show_logs_device.sh > device.log
    ./scripts/show_logs_service.sh > service.log
    ./scripts/show_logs_pathcomp_frontend.sh > pathcomp_frontend.log
    ./scripts/show_logs_nbi.sh > nbi.log
    ```
    
    The resulting logs will be stored in the root TeraFlowSDN folder.
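The individual calls above can also be wrapped in a loop; a sketch, assuming the same `show_logs_<component>.sh` naming convention:

```shell
cd ~/tfs-ctrl
for c in context device service pathcomp_frontend nbi; do
    ./scripts/show_logs_${c}.sh > "${c}.log"
done
```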
    
    ## **8.3. Download a snapshot of the Context database**
    
In the WebUI, there is a tab named `Debug` that you can use to interrogate the Context database.
In particular, it contains a link named `Dummy Contexts`. This link produces a JSON descriptor file containing all the contexts, topologies, devices, links, slices, services, connections, constraints, and configuration rules present in Context. The resulting file can be onboarded into a blank TeraFlowSDN instance for testing purposes.
    
**IMPORTANT**: The Dummy Contexts feature might take a few seconds to respond while it composes the reply.
    
**IMPORTANT**: The produced file is labelled with `"dummy": true`, meaning it is a snapshot of the database that can be loaded directly into Context. Onboarding it does not pass through the Device/Service/Slice components; the records are dropped directly into Context. This is useful for investigating the content of the database when an issue arises.
    
    **WARNING**: The dump retrieves all the information in clear text! Remember to **manually anonymize your sensitive data** such as credentials, IP addresses, etc.
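As a sketch of such an anonymization pass (the field names and patterns are illustrative; adapt them to what your dump actually contains):

```shell
# Mask IPv4 addresses and password values before sharing a dump.
# The sample JSON line stands in for the content of a downloaded descriptor.
echo '{"address": "10.0.0.1", "password": "secret"}' \
    | sed -E -e 's/([0-9]{1,3}\.){3}[0-9]{1,3}/x.x.x.x/g' \
             -e 's/("password": *")[^"]*/\1REDACTED/g'
# → {"address": "x.x.x.x", "password": "REDACTED"}
```

In practice you would run the `sed` expressions over the downloaded JSON file instead of an inline string, and extend the patterns to cover any other sensitive fields.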