Anchor Tabular beam size not used #87

@smwrig

Description

Hi, it seems the beam_size parameter of AnchorTabularExplainer.explain_instance doesn't work as intended: it never gets passed to the anchor_beam method, whose beam_size defaults to 1, which limits the search.

def explain_instance(self, data_row, classifier_fn, threshold=0.95,
                     delta=0.1, tau=0.15, batch_size=100,
                     max_anchor_size=None,
                     desired_label=None,
                     beam_size=4, **kwargs):
    # It's possible to pass in max_anchor_size
    sample_fn, mapping = self.get_sample_fn(
        data_row, classifier_fn, desired_label=desired_label)
    # return sample_fn, mapping
    exp = anchor_base.AnchorBaseBeam.anchor_beam(
        sample_fn, delta=delta, epsilon=tau, batch_size=batch_size,
        desired_confidence=threshold, max_anchor_size=max_anchor_size,
        **kwargs)

def anchor_beam(sample_fn, delta=0.05, epsilon=0.1, batch_size=10,
                min_shared_samples=0, desired_confidence=1, beam_size=1,
                verbose=False, epsilon_stop=0.05, min_samples_start=0,
                max_anchor_size=None, verbose_every=1,
                stop_on_first=False, coverage_samples=10000):
