Couchbase Server / MB-51813

Add logging to goport



    Description

It was observed in CBSE-11735 that when we issue a "shutdown" of the indexer process, it takes around 15 seconds for the entire service shutdown to complete.

7110:[ns_server:debug,2022-04-04T16:41:37.386Z,babysitter_of_ns_1@cb.local:<0.18165.101>:ns_port_server:terminate:196]Shutting down port indexer
7111-[ns_server:debug,2022-04-04T16:41:58.020Z,babysitter_of_ns_1@cb.local:indexer-goport<0.18171.101>:goport:handle_eof:582]Stream 'stdout' closed
7112-[ns_server:debug,2022-04-04T16:41:58.020Z,babysitter_of_ns_1@cb.local:indexer-goport<0.18171.101>:goport:handle_eof:582]Stream 'stderr' closed
7113-[ns_server:info,2022-04-04T16:41:58.020Z,babysitter_of_ns_1@cb.local:indexer-goport<0.18171.101>:goport:handle_process_exit:563]Port exited with status 0.
7114-[ns_server:debug,2022-04-04T16:41:58.020Z,babysitter_of_ns_1@cb.local:<0.18165.101>:ns_port_server:terminate:199]indexer has exited

Interestingly, this is not a one-off occurrence: similar, abnormally large shutdown times for the indexer appear earlier in the logs for the same node, and on other nodes as well.

      rg "Shutting down port indexer" ns_server.babysitter.log -A 4
       
      5554:[ns_server:debug,2022-03-21T21:59:48.933Z,babysitter_of_ns_1@cb.local:<0.1615.0>:ns_port_server:terminate:196]Shutting down port indexer
       
      5555-[ns_server:debug,2022-03-21T22:00:08.311Z,babysitter_of_ns_1@cb.local:indexer-goport<0.1621.0>:goport:handle_eof:582]Stream 'stdout' closed
       
      5556-[ns_server:debug,2022-03-21T22:00:08.311Z,babysitter_of_ns_1@cb.local:indexer-goport<0.1621.0>:goport:handle_eof:582]Stream 'stderr' closed
       
      5557-[ns_server:info,2022-03-21T22:00:08.311Z,babysitter_of_ns_1@cb.local:indexer-goport<0.1621.0>:goport:handle_process_exit:563]Port exited with status 0.
       
      5558-[ns_server:debug,2022-03-21T22:00:08.312Z,babysitter_of_ns_1@cb.local:<0.1615.0>:ns_port_server:terminate:199]indexer has exited
       
      --
       
      6701:[ns_server:debug,2022-04-03T06:02:40.160Z,babysitter_of_ns_1@cb.local:<0.4129.70>:ns_port_server:terminate:196]Shutting down port indexer
       
      6702-[ns_server:debug,2022-04-03T06:02:54.942Z,babysitter_of_ns_1@cb.local:indexer-goport<0.4135.70>:goport:handle_eof:582]Stream 'stderr' closed
       
      6703-[ns_server:debug,2022-04-03T06:02:54.942Z,babysitter_of_ns_1@cb.local:indexer-goport<0.4135.70>:goport:handle_eof:582]Stream 'stdout' closed
       
      6704-[ns_server:info,2022-04-03T06:02:54.942Z,babysitter_of_ns_1@cb.local:indexer-goport<0.4135.70>:goport:handle_process_exit:563]Port exited with status 0.
       
      6705-[ns_server:debug,2022-04-03T06:02:54.942Z,babysitter_of_ns_1@cb.local:<0.4129.70>:ns_port_server:terminate:199]indexer has exited
       
      --
      

It would be good to add logging to the goport code so we can pinpoint exactly where the time is being spent (or confirm that the delay is not in goport and lies elsewhere).
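As a rough, hypothetical sketch of the kind of timestamped event logging meant here (this is not goport's actual code; the function names and the SIGTERM-based shutdown below are illustrative assumptions), the idea is simply to emit a timestamp when the termination request is issued, when each of stdout/stderr reaches EOF, and when the wait on the child returns, so the gap between "Shutting down port indexer" and "Port exited" can be attributed to a specific step:

package main

import (
	"io"
	"log"
	"os"
	"os/exec"
	"sync"
	"syscall"
	"time"
)

// traceShutdown starts a child, asks it to terminate, and logs a
// timestamp for each lifecycle event relevant to a slow shutdown.
// Purely illustrative; not goport's real structure.
func traceShutdown(path string, args ...string) error {
	log.SetFlags(log.LstdFlags | log.Lmicroseconds)

	cmd := exec.Command(path, args...)
	stdout, err := cmd.StdoutPipe()
	if err != nil {
		return err
	}
	stderr, err := cmd.StderrPipe()
	if err != nil {
		return err
	}

	log.Printf("starting %s", path)
	if err := cmd.Start(); err != nil {
		return err
	}

	var wg sync.WaitGroup
	drain := func(name string, r io.Reader) {
		defer wg.Done()
		io.Copy(os.Stdout, r)                     // forward child output
		log.Printf("stream %q reached EOF", name) // timestamped via log flags
	}
	wg.Add(2)
	go drain("stdout", stdout)
	go drain("stderr", stderr)

	time.Sleep(time.Second) // pretend the service ran for a while
	log.Printf("sending SIGTERM to pid %d", cmd.Process.Pid)
	_ = cmd.Process.Signal(syscall.SIGTERM)

	wg.Wait() // finish reading the pipes before calling Wait
	log.Printf("both streams closed, calling Wait")
	err = cmd.Wait()
	log.Printf("Wait returned (err=%v)", err)
	return err
}

func main() {
	traceShutdown("/bin/sleep", "60")
}

Timestamps of this sort would show whether the delay sits between the shutdown request being delivered and the streams closing, or between the streams closing and the exit status being reported.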

Areas to investigate for the original issue in CBSE-11735:

1) The documentation for Go's exec package notes that Wait (which internally uses the wait syscall to be notified when a child process spawned via exec.Start has terminated) also waits until the stdout and stderr buffers are completely drained, and goport has its own logic that likewise waits for EOF on stdout and stderr; a small reproduction of how this combination can delay shutdown is sketched below.
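One concrete way that combination can produce exactly the symptom above: if the indexer (or anything it forks) leaves a child process holding the inherited stdout/stderr descriptors, the pipes do not hit EOF when the main process exits, so both exec's Wait and any EOF-based logic keep waiting. A minimal, self-contained Go reproduction of that effect (assuming a POSIX sh and sleep are available; this is not indexer or goport code):

package main

import (
	"bytes"
	"log"
	"os/exec"
	"time"
)

func main() {
	log.SetFlags(log.LstdFlags | log.Lmicroseconds)

	// The shell exits right away, but the backgrounded sleep inherits the
	// output pipe and keeps its write end open for ~5 seconds.
	cmd := exec.Command("sh", "-c", "echo started; sleep 5 & exit 0")

	var out bytes.Buffer
	cmd.Stdout = &out // non-*os.File writer: exec copies via an internal pipe
	cmd.Stderr = &out

	start := time.Now()
	if err := cmd.Start(); err != nil {
		log.Fatal(err)
	}
	log.Printf("started pid %d", cmd.Process.Pid)

	// Wait reaps the shell quickly, then blocks until the copy goroutine
	// sees EOF on the pipe, which only happens once the orphaned sleep
	// exits, ~5s after the shell itself is gone.
	err := cmd.Wait()
	log.Printf("Wait returned after %v (err=%v, output=%q)",
		time.Since(start).Round(time.Millisecond), err, out.String())
}

Here Wait only returns after roughly five seconds even though the shell exited almost immediately. (Newer Go releases, 1.20 and later if I recall correctly, expose exec.Cmd.WaitDelay to cap this wait, but that is separate from the logging requested here.)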




          People

Abhijeeth Nuthan
Hareen Kancharla


              Gerrit Reviews

                There is 1 open Gerrit change
