  Couchbase Server / MB-11030

Rebalance on buildbot exited with reason {{bulk_set_vbucket_state_failed


Details

    • Type: Bug
    • Resolution: Fixed
    • Priority: Test Blocker
    • Affects Version/s: 3.0
    • Fix Version/s: 3.0
    • Component/s: ns_server
    • Security Level: Public
    • Labels: None
    • Triage: Untriaged
    • Is this a Regression?: Unknown

    Description

      The last few buildbot runs failed on rebalance in test test_employee_dataset_startkey_endkey_queries_rebalance_in (view.viewquerytests.ViewQueryTests).

      http://factory.couchbase.com/job/testrunner-gerrit-master/1255/console

      https://s3.amazonaws.com/bugdb/jira/MB-11030/c7344960/cluster_run.log

      [ns_server:error,2014-05-02T23:55:49.082,n_0@10.3.2.50:<0.10104.0>:misc:sync_shutdown_many_i_am_trapping_exits:1473]Shutdown of the following failed: [{<0.10115.0>,
      {{badmatch,[{<18913.3672.0>,noproc}]},
      [{misc,sync_shutdown_many_i_am_trapping_exits,1,
      [{file,"src/misc.erl"},{line,1475}]},
      {misc,try_with_maybe_ignorant_after,2,
      [{file,"src/misc.erl"},{line,1504}]},
      {gen_server,terminate,6,
      [{file,"gen_server.erl"},{line,712}]},
      {proc_lib,init_p_do_apply,3,
      [{file,"proc_lib.erl"},{line,227}]}]}}]
      [couchdb:info,2014-05-02T23:55:49.085,n_1@127.0.0.1:<0.5302.0>:couch_log:info:39]set view `default`, main (prod) group `_design/test_view-0697cad`: received a snapshot marker for partition 44
      [couchdb:info,2014-05-02T23:55:49.086,n_1@127.0.0.1:<0.5295.0>:couch_log:info:39]set view `default`, main (prod) group `_design/test_view-65d3923`: received a snapshot marker for partition 43
      [ns_server:error,2014-05-02T23:55:49.082,n_0@10.3.2.50:<0.10104.0>:misc:try_with_maybe_ignorant_after:1509]Eating exception from ignorant after-block:
      {error,{badmatch,[{<0.10115.0>,
      {{badmatch,[{<18913.3672.0>,noproc}]},
      [{misc,sync_shutdown_many_i_am_trapping_exits,1,
      [{file,"src/misc.erl"},{line,1475}]},
      {misc,try_with_maybe_ignorant_after,2,
      [{file,"src/misc.erl"},{line,1504}]},
      {gen_server,terminate,6,
      [{file,"gen_server.erl"},{line,712}]},
      {proc_lib,init_p_do_apply,3,
      [{file,"proc_lib.erl"},{line,227}]}]}}]},
      [{misc,sync_shutdown_many_i_am_trapping_exits,1,
      [{file,"src/misc.erl"},{line,1475}]},
      {misc,try_with_maybe_ignorant_after,2,
      [{file,"src/misc.erl"},{line,1507}]},
      {ns_single_vbucket_mover,mover,6,
      [{file,"src/ns_single_vbucket_mover.erl"},{line,75}]},
      {proc_lib,init_p_do_apply,3,[{file,"proc_lib.erl"},{line,227}]}]}
      [ns_server:debug,2014-05-02T23:55:49.077,n_0@10.3.2.50:<0.10430.0>:misc:inner_wait_shutdown:1492]Here's messages:
      {messages,[{'EXIT',<0.10421.0>,shutdown}]}
      [couchdb:info,2014-05-02T23:55:49.088,n_1@127.0.0.1:<0.5302.0>:couch_log:info:39]set view `default`, main (prod) group `_design/test_view-0697cad`: received a snapshot marker for partition 46
      [error_logger:error,2014-05-02T23:55:49.083,n_0@10.3.2.50:error_logger<0.6.0>:ale_error_logger_handler:do_log:207]** Generic server <0.10216.0> terminating
      ** Last message in was {'EXIT',<18913.3768.0>,downstream_closed}
      ** When Server state == {state,"default",47,'n_0@10.3.2.50',
      [{'n_1@127.0.0.1',<18913.3768.0>}]}
      ** Reason for termination ==
      ** {{badmatch,[{<18913.3768.0>,noproc}]},
      [{misc,sync_shutdown_many_i_am_trapping_exits,1,
      [{file,"src/misc.erl"},{line,1475}]},
      {misc,try_with_maybe_ignorant_after,2,
      [{file,"src/misc.erl"},{line,1504}]},
      {gen_server,terminate,6,[{file,"gen_server.erl"},{line,712}]},
      {proc_lib,init_p_do_apply,3,[{file,"proc_lib.erl"},{line,227}]}]}

      [ns_server:error,2014-05-02T23:55:49.084,n_0@10.3.2.50:<0.10205.0>:misc:sync_shutdown_many_i_am_trapping_exits:1473]Shutdown of the following failed: [{<0.10216.0>,
      {{badmatch,[{<18913.3768.0>,noproc}]},
      [{misc,
      sync_shutdown_many_i_am_trapping_exits,
      1,
      [{file,"src/misc.erl"},{line,1475}]},
      {misc,try_with_maybe_ignorant_after,2,
      [{file,"src/misc.erl"},{line,1504}]},
      {gen_server,terminate,6,
      [{file,"gen_server.erl"},{line,712}]},
      {proc_lib,init_p_do_apply,3,
      [{file,"proc_lib.erl"},{line,227}]}]}}]
      [ns_server:info,2014-05-02T23:55:49.089,n_0@10.3.2.50:<0.10074.0>:ns_replicas_builder_utils:kill_a_bunch_of_tap_names:59]Killed the following tap names on 'n_0@10.3.2.50': [<<"replication_building_53_'n_1@127.0.0.1'">>]
      [ns_server:error,2014-05-02T23:55:49.089,n_0@10.3.2.50:<0.10430.0>:misc:sync_shutdown_many_i_am_trapping_exits:1473]Shutdown of the following failed: [{<18913.3991.0>,noproc}]
      [ns_server:error,2014-05-02T23:55:49.090,n_0@10.3.2.50:<0.10205.0>:misc:try_with_maybe_ignorant_after:1509]Eating exception from ignorant after-block:
      {error,{badmatch,[{<0.10216.0>,
      {{badmatch,[{<18913.3768.0>,noproc}]},
      [{misc,sync_shutdown_many_i_am_trapping_exits,1,
      [{file,"src/misc.erl"},{line,1475}]},
      {misc,try_with_maybe_ignorant_after,2,
      [{file,"src/misc.erl"},{line,1504}]},
      {gen_server,terminate,6,
      [{file,"gen_server.erl"},{line,712}]},
      {proc_lib,init_p_do_apply,3,
      [{file,"proc_lib.erl"},{line,227}]}]}}]},
      [{misc,sync_shutdown_many_i_am_trapping_exits,1,
      [{file,"src/misc.erl"},{line,1475}]},
      {misc,try_with_maybe_ignorant_after,2,
      [{file,"src/misc.erl"},{line,1507}]},
      {ns_single_vbucket_mover,mover,6,
      [{file,"src/ns_single_vbucket_mover.erl"},{line,75}]},
      {proc_lib,init_p_do_apply,3,[{file,"proc_lib.erl"},{line,227}]}]}
      [ns_server:error,2014-05-02T23:55:49.091,n_0@10.3.2.50:<0.10061.0>:misc:sync_shutdown_many_i_am_trapping_exits:1473]Shutdown of the following failed: [{<0.10074.0>,
      {{badmatch,[{<18913.3629.0>,noproc}]},
      [{misc,
      sync_shutdown_many_i_am_trapping_exits,
      1,
      [{file,"src/misc.erl"},{line,1475}]},
      {misc,try_with_maybe_ignorant_after,2,
      [{file,"src/misc.erl"},{line,1504}]},
      {gen_server,terminate,6,
      [{file,"gen_server.erl"},{line,712}]},
      {proc_lib,init_p_do_apply,3,
      [{file,"proc_lib.erl"},{line,227}]}]}}]
      [couchdb:info,2014-05-02T23:55:49.092,n_1@127.0.0.1:<0.5295.0>:couch_log:info:39]set view `default`, main (prod) group `_design/test_view-65d3923`: received a snapshot marker for partition 44
      [error_logger:error,2014-05-02T23:55:49.091,n_0@10.3.2.50:error_logger<0.6.0>:ale_error_logger_handler:do_log:207]
      =========================CRASH REPORT=========================
      crasher:
      initial call: new_ns_replicas_builder:init/1
      pid: <0.10216.0>
      registered_name: []
      exception exit: {{badmatch,[{<18913.3768.0>,noproc}]},
      [{misc,sync_shutdown_many_i_am_trapping_exits,1,
      [{file,"src/misc.erl"},{line,1475}]},
      {misc,try_with_maybe_ignorant_after,2,
      [{file,"src/misc.erl"},{line,1504}]},
      {gen_server,terminate,6,
      [{file,"gen_server.erl"},{line,712}]},
      {proc_lib,init_p_do_apply,3,
      [{file,"proc_lib.erl"},{line,227}]}]}
      in function gen_server:terminate/6 (gen_server.erl, line 715)
      ancestors: [<0.10205.0>,<0.9817.0>,<0.9773.0>,<0.1183.0>,mb_master_sup,
      mb_master,ns_server_sup,'ns_server_sup-wrapper',
      ns_server_cluster_sup,<0.58.0>]
      messages: [{'EXIT',<0.10205.0>,shutdown}]
      links: [<0.10205.0>]
      dictionary: []
      trap_exit: true
      status: running
      heap_size: 46368
      stack_size: 24
      reductions: 10511
      neighbours:

      [couchdb:info,2014-05-02T23:55:49.093,n_1@127.0.0.1:<0.5302.0>:couch_log:info:39]set view `default`, main (prod) group `_design/test_view-0697cad`: received a snapshot marker for partition 47
      [ns_server:error,2014-05-02T23:55:49.092,n_0@10.3.2.50:<0.10061.0>:misc:try_with_maybe_ignorant_after:1509]Eating exception from ignorant after-block:
      {error,{badmatch,[{<0.10074.0>,
      {{badmatch,[{<18913.3629.0>,noproc}]},
      [{misc,sync_shutdown_many_i_am_trapping_exits,1,
      [{file,"src/misc.erl"},{line,1475}]},
      {misc,try_with_maybe_ignorant_after,2,
      [{file,"src/misc.erl"},{line,1504}]},
      {gen_server,terminate,6,
      [{file,"gen_server.erl"},{line,712}]},
      {proc_lib,init_p_do_apply,3,
      [{file,"proc_lib.erl"},{line,227}]}]}}]},
      [{misc,sync_shutdown_many_i_am_trapping_exits,1,
      [{file,"src/misc.erl"},{line,1475}]},
      {misc,try_with_maybe_ignorant_after,2,
      [{file,"src/misc.erl"},{line,1507}]},
      {ns_single_vbucket_mover,mover,6,
      [{file,"src/ns_single_vbucket_mover.erl"},{line,75}]},
      {proc_lib,init_p_do_apply,3,[{file,"proc_lib.erl"},{line,227}]}]}
      [error_logger:error,2014-05-02T23:55:49.093,n_0@10.3.2.50:error_logger<0.6.0>:ale_error_logger_handler:do_log:207]** Generic server <0.10115.0> terminating
      ** Last message in was {'EXIT',<18913.3672.0>,downstream_closed}
      ** When Server state == {state,"default",51,'n_0@10.3.2.50',
      [{'n_1@127.0.0.1',<18913.3672.0>}]}
      ** Reason for termination ==
      ** {{badmatch,[{<18913.3672.0>,noproc}]},
      [{misc,sync_shutdown_many_i_am_trapping_exits,1,
      [{file,"src/misc.erl"},{line,1475}]},
      {misc,try_with_maybe_ignorant_after,2,
      [{file,"src/misc.erl"},{line,1504}]},
      {gen_server,terminate,6,[{file,"gen_server.erl"},{line,712}]},
      {proc_lib,init_p_do_apply,3,[{file,"proc_lib.erl"},{line,227}]}]}

      [ns_server:info,2014-05-02T23:55:49.095,n_0@10.3.2.50:<0.10430.0>:ns_replicas_builder_utils:kill_a_bunch_of_tap_names:59]Killed the following tap names on 'n_0@10.3.2.50': [<<"replication_building_38_'n_1@127.0.0.1'">>]
      [ns_server:error,2014-05-02T23:55:49.095,n_0@10.3.2.50:<0.10421.0>:misc:sync_shutdown_many_i_am_trapping_exits:1473]Shutdown of the following failed: [{<0.10430.0>,
      {{badmatch,[{<18913.3991.0>,noproc}]},
      [{misc,
      sync_shutdown_many_i_am_trapping_exits,
      1,
      [{file,"src/misc.erl"},{line,1475}]},
      {misc,try_with_maybe_ignorant_after,2,
      [{file,"src/misc.erl"},{line,1504}]},
      {gen_server,terminate,6,
      [{file,"gen_server.erl"},{line,712}]},
      {proc_lib,init_p_do_apply,3,
      [{file,"proc_lib.erl"},{line,227}]}]}}]
      [error_logger:error,2014-05-02T23:55:49.095,n_0@10.3.2.50:error_logger<0.6.0>:ale_error_logger_handler:do_log:207]
      =========================CRASH REPORT=========================
      crasher:
      initial call: new_ns_replicas_builder:init/1
      pid: <0.10115.0>
      registered_name: []
      exception exit: {{badmatch,[{<18913.3672.0>,noproc}]},
      [{misc,sync_shutdown_many_i_am_trapping_exits,1,
      [{file,"src/misc.erl"},{line,1475}]},
      {misc,try_with_maybe_ignorant_after,2,
      [{file,"src/misc.erl"},{line,1504}]},
      {gen_server,terminate,6,
      [{file,"gen_server.erl"},{line,712}]},
      {proc_lib,init_p_do_apply,3,
      [{file,"proc_lib.erl"},{line,227}]}]}
      in function gen_server:terminate/6 (gen_server.erl, line 715)
      ancestors: [<0.10104.0>,<0.9817.0>,<0.9773.0>,<0.1183.0>,mb_master_sup,
      mb_master,ns_server_sup,'ns_server_sup-wrapper',
      ns_server_cluster_sup,<0.58.0>]
      messages: [{'EXIT',<0.10104.0>,shutdown}]
      links: [<0.10104.0>]
      dictionary: []
      trap_exit: true
      status: running
      heap_size: 46368
      stack_size: 24
      reductions: 10523
      neighbours:

      [ns_server:error,2014-05-02T23:55:49.096,n_0@10.3.2.50:<0.10421.0>:misc:try_with_maybe_ignorant_after:1509]Eating exception from ignorant after-block:
      {error,{badmatch,[{<0.10430.0>,
      {{badmatch,[{<18913.3991.0>,noproc}]},
      [{misc,sync_shutdown_many_i_am_trapping_exits,1,
      [{file,"src/misc.erl"},{line,1475}]},
      {misc,try_with_maybe_ignorant_after,2,
      [{file,"src/misc.erl"},{line,1504}]},
      {gen_server,terminate,6,
      [{file,"gen_server.erl"},{line,712}]},
      {proc_lib,init_p_do_apply,3,
      [{file,"proc_lib.erl"},{line,227}]}]}}]},
      [{misc,sync_shutdown_many_i_am_trapping_exits,1,
      [{file,"src/misc.erl"},{line,1475}]},
      {misc,try_with_maybe_ignorant_after,2,
      [{file,"src/misc.erl"},{line,1507}]},
      {ns_single_vbucket_mover,mover,6,
      [{file,"src/ns_single_vbucket_mover.erl"},{line,75}]},
      {proc_lib,init_p_do_apply,3,[{file,"proc_lib.erl"},{line,227}]}]}
      [couchdb:info,2014-05-02T23:55:49.097,n_1@127.0.0.1:<0.5295.0>:couch_log:info:39]set view `default`, main (prod) group `_design/test_view-65d3923`: received a snapshot marker for partition 46
      [couchdb:info,2014-05-02T23:55:49.097,n_1@127.0.0.1:<0.5140.0>:couch_log:info:39]127.0.0.1 - - GET /_active_tasks 200
      [ns_server:debug,2014-05-02T23:55:49.099,n_0@10.3.2.50:compaction_daemon<0.505.0>:compaction_daemon:handle_info:456]Looks like vbucket mover inhibiting view compaction for for bucket "default" is dead. Canceling inhibition
      [ns_server:debug,2014-05-02T23:55:49.101,n_1@127.0.0.1:compaction_daemon<0.3078.0>:compaction_daemon:handle_info:456]Looks like vbucket mover inhibiting view compaction for for bucket "default" is dead. Canceling inhibition
      [couchdb:info,2014-05-02T23:55:49.101,n_1@127.0.0.1:<0.5302.0>:couch_log:info:39]set view `default`, main (prod) group `_design/test_view-0697cad`: received a snapshot marker for partition 48
      [couchdb:info,2014-05-02T23:55:49.101,n_1@127.0.0.1:<0.5295.0>:couch_log:info:39]set view `default`, main (prod) group `_design/test_view-65d3923`: received a snapshot marker for partition 47
      [couchdb:info,2014-05-02T23:55:49.106,n_1@127.0.0.1:<0.5295.0>:couch_log:info:39]set view `default`, main (prod) group `_design/test_view-65d3923`: received a snapshot marker for partition 48
      [couchdb:info,2014-05-02T23:55:49.110,n_1@127.0.0.1:<0.5302.0>:couch_log:info:39]set view `default`, main (prod) group `_design/test_view-0697cad`: received a snapshot marker for partition 51
      [ns_server:debug,2014-05-02T23:55:49.098,n_0@10.3.2.50:<0.9821.0>:ns_pubsub:do_subscribe_link:136]Parent process of subscription {ns_node_disco_events,<0.9817.0>} exited with reason {{bulk_set_vbucket_state_failed,
      [{'n_0@10.3.2.50',
      {'EXIT',
      {{{{{unexpected_reason,
      {badmatch,
      {error, closed}}},
      [{misc,
      executing_on_new_process,
      1,
      [{file, "src/misc.erl"},
      {line, 1455}]},
      {ns_vbm_sup,
      perform_vbucket_filter_change,
      6,
      [{file, "src/ns_vbm_sup.erl"},
      {line, 100}]},
      {tap_replication_manager,
      change_vbucket_filter,
      4,
      [{file, "src/tap_replication_manager.erl"},
      {line, 181}]},
      {tap_replication_manager,
      'set_incoming_replication_map/2-lc$^2/1-2',
      2,
      [{file, "src/tap_replication_manager.erl"},
      {line, 87}]},
      {tap_replication_manager,
      set_incoming_replication_map,
      2,
      [{file, "src/tap_replication_manager.erl"},
      {line, 87}]},
      {tap_replication_manager,
      handle_call,
      3,
      [{file, "src/tap_replication_manager.erl"},
      {line, 110}]},
      {gen_server,
      handle_msg,
      5,
      [{file, "gen_server.erl"},
      {line, 578}]},
      {proc_lib,
      init_p_do_apply,
      3,
      [{file, "proc_lib.erl"},
      {line, 227}]}]},
      {gen_server,
      call,
      ['tap_replication_manager-default',
      {set_desired_replications,
      [{'n_1@127.0.0.1', "-12;<=>?"}]},
      infinity]}},
      {gen_server,
      call,
      ['replication_manager-default',
      {change_vbucket_replication, 45, 'n_1@127.0.0.1'},
      infinity]}},
      {gen_server,
      call,
      [{'janitor_agent-default', 'n_0@10.3.2.50'},
      {if_rebalance,
      <0.9817.0>,
      {update_vbucket_state,
      45,
      replica,
      undefined,
      'n_1@127.0.0.1'}},
      infinity]}}}}]},
      [{janitor_agent,
      bulk_set_vbucket_state,
      4,
      [{file, "src/janitor_agent.erl"},
      {line, 383}]},
      {ns_vbucket_mover,
      update_replication_post_move,
      3,
      [{file, "src/ns_vbucket_mover.erl"},
      {line, 380}]},
      {ns_vbucket_mover,
      on_move_done,
      2,
      [{file, "src/ns_vbucket_mover.erl"},
      {line, 244}]},
      {gen_server,
      handle_msg,
      5,
      [{file, "gen_server.erl"},
      {line, 597}]},
      {proc_lib,
      init_p_do_apply,
      3,
      [{file, "proc_lib.erl"},
      {line, 227}]}]}
      [couchdb:info,2014-05-02T23:55:49.115,n_1@127.0.0.1:<0.5302.0>:couch_log:info:39]set view `default`, main (prod) group `_design/test_view-0697cad`: received a snapshot marker for partition 52
      [couchdb:info,2014-05-02T23:55:49.117,n_1@127.0.0.1:<0.5295.0>:couch_log:info:39]set view `default`, main (prod) group `_design/test_view-65d3923`: received a snapshot marker for partition 51
      [ns_server:debug,2014-05-02T23:55:49.098,n_0@10.3.2.50:<0.9788.0>:ns_pubsub:do_subscribe_link:136]Parent process of subscription {master_activity_events,<0.9787.0>} exited with reason {{bulk_set_vbucket_state_failed,
      [{'n_0@10.3.2.50',
      {'EXIT',
      {{{{{unexpected_reason,
      {badmatch,
      {error, closed}}},
      [{misc,
      executing_on_new_process,
      1,
      [{file, "src/misc.erl"},
      {line, 1455}]},
      {ns_vbm_sup,
      perform_vbucket_filter_change,
      6,
      [{file, "src/ns_vbm_sup.erl"},
      {line, 100}]},
      {tap_replication_manager,
      change_vbucket_filter,
      4,
      [{file, "src/tap_replication_manager.erl"},
      {line, 181}]},
      {tap_replication_manager,
      'set_incoming_replication_map/2-lc$^2/1-2',
      2,
      [{file, "src/tap_replication_manager.erl"},
      {line, 87}]},
      {tap_replication_manager,
      set_incoming_replication_map,
      2,
      [{file, "src/tap_replication_manager.erl"},
      {line, 87}]},
      {tap_replication_manager,
      handle_call,
      3,
      [{file, "src/tap_replication_manager.erl"},
      {line, 110}]},
      {gen_server,
      handle_msg,
      5,
      [{file, "gen_server.erl"},
      {line, 578}]},
      {proc_lib,
      init_p_do_apply,
      3,
      [{file, "proc_lib.erl"},
      {line, 227}]}]},
      {gen_server,
      call,
      ['tap_replication_manager-default',
      {set_desired_replications,
      [{'n_1@127.0.0.1', "-12;<=>?"}]},
      infinity]}},
      {gen_server,
      call,
      ['replication_manager-default',
      {change_vbucket_replication, 45, 'n_1@127.0.0.1'},
      infinity]}},
      {gen_server,
      call,
      [{'janitor_agent-default', 'n_0@10.3.2.50'},
      {if_rebalance,
      <0.9817.0>,
      {update_vbucket_state,
      45,
      replica,
      undefined,
      'n_1@127.0.0.1'}},
      infinity]}}}}]},
      [{janitor_agent,
      bulk_set_vbucket_state,
      4,
      [{file, "src/janitor_agent.erl"},
      {line, 383}]},
      {ns_vbucket_mover,
      update_replication_post_move,
      3,
      [{file, "src/ns_vbucket_mover.erl"},
      {line, 380}]},
      {ns_vbucket_mover,
      on_move_done,
      2,
      [{file, "src/ns_vbucket_mover.erl"},
      {line, 244}]},
      {gen_server,
      handle_msg,
      5,
      [{file, "gen_server.erl"},
      {line, 597}]},
      {proc_lib,
      init_p_do_apply,
      3,
      [{file, "proc_lib.erl"},
      {line, 227}]}]}
      [couchdb:info,2014-05-02T23:55:49.122,n_1@127.0.0.1:<0.5302.0>:couch_log:info:39]set view `default`, main (prod) group `_design/test_view-0697cad`: received a snapshot marker for partition 53
      [couchdb:info,2014-05-02T23:55:49.122,n_1@127.0.0.1:<0.5295.0>:couch_log:info:39]set view `default`, main (prod) group `_design/test_view-65d3923`: received a snapshot marker for partition 52
      [user:info,2014-05-02T23:55:49.099,n_0@10.3.2.50:<0.1183.0>:ns_orchestrator:handle_info:480]Rebalance exited with reason {{bulk_set_vbucket_state_failed,
      [{'n_0@10.3.2.50',
      {'EXIT',
      {{{{{unexpected_reason,
      {badmatch,{error,closed}}},
      [{misc,executing_on_new_process,1,
      [{file,"src/misc.erl"},{line,1455}]},
      {ns_vbm_sup,
      perform_vbucket_filter_change,6,
      [{file,"src/ns_vbm_sup.erl"},
      {line,100}]},
      {tap_replication_manager,
      change_vbucket_filter,4,
      [{file, "src/tap_replication_manager.erl"},
      {line,181}]},
      {tap_replication_manager,
      'set_incoming_replication_map/2-lc$^2/1-2',
      2,
      [{file, "src/tap_replication_manager.erl"},
      {line,87}]},
      {tap_replication_manager,
      set_incoming_replication_map,2,
      [{file, "src/tap_replication_manager.erl"},
      {line,87}]},
      {tap_replication_manager,handle_call,
      3,
      [{file, "src/tap_replication_manager.erl"},
      {line,110}]},
      {gen_server,handle_msg,5,
      [{file,"gen_server.erl"},{line,578}]},
      {proc_lib,init_p_do_apply,3,
      [{file,"proc_lib.erl"},{line,227}]}]},
      {gen_server,call,
      ['tap_replication_manager-default',
      {set_desired_replications,
      [{'n_1@127.0.0.1',"-12;<=>?"}]},
      infinity]}},
      {gen_server,call,
      ['replication_manager-default',
      {change_vbucket_replication,45, 'n_1@127.0.0.1'},
      infinity]}},
      {gen_server,call,
      [{'janitor_agent-default','n_0@10.3.2.50'},
      {if_rebalance,<0.9817.0>,
      {update_vbucket_state,45,replica,
      undefined,'n_1@127.0.0.1'}},
      infinity]}}}}]},
      [{janitor_agent,bulk_set_vbucket_state,4,
      [{file,"src/janitor_agent.erl"},{line,383}]},
      {ns_vbucket_mover,
      update_replication_post_move,3,
      [{file,"src/ns_vbucket_mover.erl"},
      {line,380}]},
      {ns_vbucket_mover,on_move_done,2,
      [{file,"src/ns_vbucket_mover.erl"},
      {line,244}]},
      {gen_server,handle_msg,5,
      [{file,"gen_server.erl"},{line,597}]},
      {proc_lib,init_p_do_apply,3,
      [{file,"proc_lib.erl"},{line,227}]}]}
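
      A note on reading the "Rebalance exited with reason" entry above: the deep nesting is standard OTP behaviour. Each failed gen_server:call along the chain (tap_replication_manager-default, then replication_manager-default, then janitor_agent-default) wraps the reason it saw as {Reason,{gen_server,call,[Name,Request,Timeout]}}, so the original {unexpected_reason,{badmatch,{error,closed}}} ends up inside three such layers. The following is a hypothetical, minimal sketch of that wrapping only; the module and the server names inner_srv/middle_srv are made up and this is not ns_server code.

      -module(call_nesting_sketch).
      -behaviour(gen_server).

      -export([demo/0, start_link/1]).
      -export([init/1, handle_call/3, handle_cast/2]).

      %% Both servers run this same callback module.
      start_link(Name) ->
          gen_server:start_link({local, Name}, ?MODULE, [], []).

      init([]) ->
          {ok, #{}}.

      %% "inner" server: dies in handle_call, loosely like the
      %% {badmatch,{error,closed}} during the tap vbucket filter change.
      handle_call(crash, _From, State) ->
          ok = fake_socket_op(),
          {reply, ok, State};
      %% "middle" server: forwards the request with an infinity timeout,
      %% loosely like one manager process calling the next one.
      handle_call({forward, To, Req}, _From, State) ->
          {reply, gen_server:call(To, Req, infinity), State}.

      handle_cast(_Msg, State) ->
          {noreply, State}.

      %% Stand-in for the failing socket operation.
      fake_socket_op() ->
          {error, closed}.

      %% Run once per fresh shell (the names stay registered afterwards).
      demo() ->
          process_flag(trap_exit, true),
          {ok, _} = start_link(inner_srv),
          {ok, _} = start_link(middle_srv),
          %% Each dying gen_server:call layer wraps the reason it saw as
          %% {Reason,{gen_server,call,[Name,Request,Timeout]}}.
          catch gen_server:call(middle_srv, {forward, inner_srv, crash}, infinity).

      Running call_nesting_sketch:demo() in a shell returns {'EXIT',{{InnerReason,{gen_server,call,[inner_srv,crash,infinity]}},{gen_server,call,[middle_srv,{forward,inner_srv,crash},infinity]}}}: one wrapping layer per failed call, which is the same shape as the rebalance exit reason above, just two levels deep instead of three.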

      [ns_server:info,2014-05-02T23:55:49.125,n_1@127.0.0.1:<0.5305.0>:diag_handler:log_all_tap_and_checkpoint_stats:125]logging tap & checkpoint stats
      [couchdb:info,2014-05-02T23:55:49.127,n_1@127.0.0.1:<0.5302.0>:couch_log:info:39]set view `default`, main (prod) group `_design/test_view-0697cad`: received a snapshot marker for partition 54
      [couchdb:info,2014-05-02T23:55:49.131,n_1@127.0.0.1:<0.5302.0>:couch_log:info:39]set view `default`, main (prod) group `_design/test_view-0697cad`: received a snapshot marker for partition 55
      [couchdb:info,2014-05-02T23:55:49.131,n_1@127.0.0.1:<0.5295.0>:couch_log:info:39]set view `default`, main (prod) group `_design/test_view-65d3923`: received a snapshot marker for partition 53
      [error_logger:error,2014-05-02T23:55:49.106,n_0@10.3.2.50:error_logger<0.6.0>:ale_error_logger_handler:do_log:207]
      =========================CRASH REPORT=========================
      crasher:
      initial call: ns_single_vbucket_mover:mover/6
      pid: <0.10104.0>
      registered_name: []
      exception exit: {unexpected_exit,
      {'EXIT',<0.9817.0>,
      {{bulk_set_vbucket_state_failed,
      [{'n_0@10.3.2.50',
      {'EXIT',
      {{{{{unexpected_reason,{badmatch,{error,closed}}},
      [{misc,executing_on_new_process,1,
      [{file,"src/misc.erl"},{line,1455}]},
      {ns_vbm_sup,perform_vbucket_filter_change,6,
      [{file,"src/ns_vbm_sup.erl"},{line,100}]},
      {tap_replication_manager,
      change_vbucket_filter,4,
      [{file,"src/tap_replication_manager.erl"},
      {line,181}]},
      {tap_replication_manager,
      'set_incoming_replication_map/2-lc$^2/1-2',
      2,
      [{file,"src/tap_replication_manager.erl"},
      {line,87}]},
      {tap_replication_manager,
      set_incoming_replication_map,2,
      [{file,"src/tap_replication_manager.erl"},
      {line,87}]},
      {tap_replication_manager,handle_call,3,
      [{file,"src/tap_replication_manager.erl"},
      {line,110}]},
      {gen_server,handle_msg,5,
      [{file,"gen_server.erl"},{line,578}]},
      {proc_lib,init_p_do_apply,3,
      [{file,"proc_lib.erl"},{line,227}]}]},
      {gen_server,call,
      ['tap_replication_manager-default',
      {set_desired_replications,
      [{'n_1@127.0.0.1',"-12;<=>?"}]},
      infinity]}},
      {gen_server,call,
      ['replication_manager-default',
      {change_vbucket_replication,45,'n_1@127.0.0.1'},
      infinity]}},
      {gen_server,call,
      [{'janitor_agent-default','n_0@10.3.2.50'},
      {if_rebalance,<0.9817.0>,
      {update_vbucket_state,45,replica,undefined,
      'n_1@127.0.0.1'}},
      infinity]}}}}]},
      [{janitor_agent,bulk_set_vbucket_state,4,
      [{file,"src/janitor_agent.erl"},{line,383}]},
      {ns_vbucket_mover,update_replication_post_move,3,
      [{file,"src/ns_vbucket_mover.erl"},{line,380}]},
      {ns_vbucket_mover,on_move_done,2,
      [{file,"src/ns_vbucket_mover.erl"},{line,244}]},
      {gen_server,handle_msg,5,
      [{file,"gen_server.erl"},{line,597}]},
      {proc_lib,init_p_do_apply,3,
      [{file,"proc_lib.erl"},{line,227}]}]}}}
      in function ns_single_vbucket_mover:spawn_and_wait/1 (src/ns_single_vbucket_mover.erl, line 108)
      in call from ns_single_vbucket_mover:wait_index_updated/5 (src/ns_single_vbucket_mover.erl, line 179)
      in call from ns_single_vbucket_mover:mover_inner/5 (src/ns_single_vbucket_mover.erl, line 425)
      in call from misc:try_with_maybe_ignorant_after/2 (src/misc.erl, line 1504)
      in call from ns_single_vbucket_mover:mover/6 (src/ns_single_vbucket_mover.erl, line 75)
      ancestors: [<0.9817.0>,<0.9773.0>,<0.1183.0>,mb_master_sup,mb_master,
      ns_server_sup,'ns_server_sup-wrapper',ns_server_cluster_sup,
      <0.58.0>]
      messages: [{'EXIT',<0.9817.0>,
      {{bulk_set_vbucket_state_failed,
      [{'n_0@10.3.2.50',
      {'EXIT',
      {{{{{unexpected_reason,{badmatch,{error,closed}}},
      [{misc,executing_on_new_process,1,
      [{file,"src/misc.erl"},{line,1455}]},
      {ns_vbm_sup,perform_vbucket_filter_change,6,
      [{file,"src/ns_vbm_sup.erl"},{line,100}]},
      {tap_replication_manager,change_vbucket_filter,4,
      [{file,"src/tap_replication_manager.erl"},{line,181}]},
      {tap_replication_manager,
      'set_incoming_replication_map/2-lc$^2/1-2',2,
      [{file,"src/tap_replication_manager.erl"},{line,87}]},
      {tap_replication_manager,
      set_incoming_replication_map,2,
      [{file,"src/tap_replication_manager.erl"},{line,87}]},
      {tap_replication_manager,handle_call,3,
      [{file,"src/tap_replication_manager.erl"},{line,110}]},
      {gen_server,handle_msg,5,
      [{file,"gen_server.erl"},{line,578}]},
      {proc_lib,init_p_do_apply,3,
      [{file,"proc_lib.erl"},{line,227}]}]},
      {gen_server,call,
      ['tap_replication_manager-default',
      {set_desired_replications,
      [{'n_1@127.0.0.1',"-12;<=>?"}]},
      infinity]}},
      {gen_server,call,
      ['replication_manager-default',
      {change_vbucket_replication,45,'n_1@127.0.0.1'},
      infinity]}},
      {gen_server,call,
      [{'janitor_agent-default','n_0@10.3.2.50'},
      {if_rebalance,<0.9817.0>,
      {update_vbucket_state,45,replica,undefined,
      'n_1@127.0.0.1'}},
      infinity]}}}}]},
      [{janitor_agent,bulk_set_vbucket_state,4,
      [{file,"src/janitor_agent.erl"},{line,383}]},
      {ns_vbucket_mover,update_replication_post_move,3,
      [{file,"src/ns_vbucket_mover.erl"},{line,380}]},
      {ns_vbucket_mover,on_move_done,2,
      [{file,"src/ns_vbucket_mover.erl"},{line,244}]},
      {gen_server,handle_msg,5,
      [{file,"gen_server.erl"},{line,597}]},
      {proc_lib,init_p_do_apply,3,
      [{file,"proc_lib.erl"},{line,227}]}]}}]
      links: [<0.9817.0>]
      dictionary: [{cleanup_list,[<0.10115.0>,<0.10171.0>]}]
      trap_exit: true
      status: running
      heap_size: 6765
      stack_size: 24
      reductions: 17076
      neighbours:
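
      The repeated {badmatch,[{Pid,noproc}]} terminations above all come out of misc:sync_shutdown_many_i_am_trapping_exits, which evidently asserts that every child it asks to stop actually reports exit reason shutdown; a child that is already gone reports noproc instead, so the assertion fails and the failure cascades up through the replica builders and vbucket movers. Below is a hypothetical, minimal sketch of that assertion pattern; the module and function names are illustrative only and this is not the real misc.erl code.

      -module(shutdown_sketch).
      -export([demo/0]).

      %% Ask every pid to shut down and assert that each one really exited
      %% with reason 'shutdown'. Monitoring first means an already-dead pid
      %% is reported as 'DOWN' with reason noproc.
      sync_shutdown_many(Pids) ->
          Refs = [{P, erlang:monitor(process, P)} || P <- Pids],
          [exit(P, shutdown) || P <- Pids],
          Results = [{P, wait_down(P, Ref)} || {P, Ref} <- Refs],
          Bad = [PR || {_P, Reason} = PR <- Results, Reason =/= shutdown],
          %% The assertion that blows up: any child that did not exit with
          %% reason 'shutdown' lands in Bad, and this match fails with
          %% {badmatch,[{Pid,noproc}]}, the shape seen throughout the log.
          [] = Bad,
          ok.

      wait_down(Pid, Ref) ->
          receive
              {'DOWN', Ref, process, Pid, Reason} -> Reason
          end.

      demo() ->
          Alive = spawn(fun() -> receive stop -> ok end end),
          Dead = spawn(fun() -> ok end),      %% terminates immediately
          timer:sleep(50),                    %% make sure Dead is really gone
          sync_shutdown_many([Alive, Dead]).  %% => {badmatch,[{DeadPid,noproc}]}

      Calling shutdown_sketch:demo() crashes with a no-match error on [{Pid,noproc}], matching the shape of the failed shutdowns reported above.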

      Attachments


        Activity

          People

            Assignee: alkondratenko Aleksey Kondratenko (Inactive)
            Reporter: andreibaranouski Andrei Baranouski
            Votes: 0
            Watchers: 2

            Dates

              Created:
              Updated:
              Resolved:

              Gerrit Reviews

                There are no open Gerrit changes
