pyhs3.distributions.HistFactoryDistChannel

class pyhs3.distributions.HistFactoryDistChannel(**data)[source]

HistFactory probability distribution for a single channel/region.

Implements binned statistical models consisting of histograms (step functions) with various modifiers as defined in the HS3 specification. Each HistFactoryDistChannel represents one independent measurement channel/region with its own observed data. Multiple channels can be combined in a workspace to form a complete HistFactory model.

The total likelihood consists of two pieces:
  1. Main likelihood: a Poisson likelihood comparing observed bin counts with expected rates

  2. Constraint likelihoods: constraint terms for the nuisance parameters
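As a plain-Python illustration of the main term (not the pyhs3 PyTensor implementation), the main likelihood is a product over bins of Poisson probabilities for the observed counts given the expected rates, accumulated in log space:

```python
import math

def poisson_logpmf(n, lam):
    # log Pois(n | lam) = n*log(lam) - lam - log(n!)
    return n * math.log(lam) - lam - math.lgamma(n + 1)

def main_loglikelihood(observed, expected):
    # Main HistFactory term: product over bins of Pois(n_i | lambda_i),
    # summed in log space for numerical stability.
    return sum(poisson_logpmf(n, lam) for n, lam in zip(observed, expected))

observed = [12, 7, 3]        # toy per-bin observed counts
expected = [10.0, 8.0, 2.5]  # toy per-bin expected rates
print(main_loglikelihood(observed, expected))
```

The constraint terms for nuisance parameters are added on top of this, as described below.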

The prediction for a binned region is given as:

\[\lambda(x) = \sum_{s \in \text{samples}} \left[ \left( d_s(x) + \sum_{\delta \in M_\delta} \delta(x,\theta_\delta) \right) \prod_{\kappa \in M_\kappa} \kappa(x,\theta_\kappa) \right]\]
where:
  • \(d_s(x)\) is the nominal prediction for sample \(s\)

  • \(M_\delta\) are additive modifiers (histosys)

  • \(M_\kappa\) are multiplicative modifiers (normfactor, normsys, shapefactor, etc.)
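The formula above can be sketched numerically with NumPy. This is an illustrative toy, not the pyhs3 internals (which build the expression symbolically with PyTensor); the function names are hypothetical:

```python
import numpy as np

def sample_rates(nominal, additive=(), multiplicative=()):
    # Per-sample evaluation of the formula above:
    # (d_s + sum of additive deltas) * product of multiplicative kappas, per bin.
    lam = np.asarray(nominal, dtype=float)
    for delta in additive:        # histosys-style shifts
        lam = lam + np.asarray(delta, dtype=float)
    for kappa in multiplicative:  # normfactor/normsys/shapefactor-style factors
        lam = lam * np.asarray(kappa, dtype=float)
    return lam

def channel_prediction(samples):
    # Sum the modified rates over all samples in the channel.
    return sum(sample_rates(**s) for s in samples)

pred = channel_prediction([
    {"nominal": [10.0, 20.0], "additive": [[1.0, -2.0]], "multiplicative": [[1.5, 1.5]]},
    {"nominal": [5.0, 5.0]},  # a second, unmodified sample
])
print(pred)  # per-bin totals: 21.5 and 32.0
```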

Observed Data Convention:

Observed data must be provided in the evaluation context under the key {name}_observed, where name is the name of the HistFactory distribution. This key is required for likelihood evaluation.
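For example, under this convention a channel distribution named "SR" expects its observed counts under the key "SR_observed" (the parameter name "mu" below is hypothetical):

```python
name = "SR"  # name of the HistFactory distribution
context = {
    f"{name}_observed": [12, 7, 3],  # per-bin observed counts for this channel
    "mu": 1.0,                       # example parameter value (hypothetical name)
}

# The distribution looks up its observed data by this derived key.
assert f"{name}_observed" in context
```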

Constraint Types:
  • Gaussian constraints (default): histosys, normsys, staterror

  • Poisson constraints (default): shapesys

  • All constraint types can be overridden via the constraint field
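Restated as a plain-Python mapping (a paraphrase of the list above, not pyhs3 code):

```python
# Default constraint per constrained modifier type, as listed above;
# each can be overridden through the distribution's constraint field.
DEFAULT_CONSTRAINT = {
    "histosys": "Gaussian",
    "normsys": "Gaussian",
    "staterror": "Gaussian",
    "shapesys": "Poisson",
}
```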

Parameters:
  • axes (list) – Array of axis definitions with binning information

  • samples (list) – Array of sample definitions with data and modifiers

Supported Modifiers:
  • normfactor: Multiplicative scaling by parameter value

  • normsys: Multiplicative systematic with hi/lo interpolation

  • histosys: Additive correlated shape systematic

  • shapefactor: Uncorrelated multiplicative bin-by-bin scaling

  • shapesys: Uncorrelated shape systematic with Poisson constraints

  • staterror: Statistical uncertainty via Barlow-Beeston method

Modifier Naming in Dependency Graph:

Modifiers have simple names (e.g., “lumi”) in the HS3 specification, but are given unique identifiers in the dependency graph by prepending the full context: {dist_name}/{sample_name}/{modifier_type}/{modifier_name}

This design distinguishes individual modifier instances while letting parameters express correlation: modifiers that share the same parameter name are correlated.

Example: Two modifiers both named “lumi” in different samples will have unique graph nodes like “SR/signal/normsys/lumi” and “CR/background/normsys/lumi”, but if they share the same parameter name, they are correlated.
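The naming scheme reduces to a simple string join; the helper name here is hypothetical, written only to illustrate the convention:

```python
def modifier_node_id(dist_name, sample_name, modifier_type, modifier_name):
    # Unique dependency-graph identifier for a single modifier instance.
    return f"{dist_name}/{sample_name}/{modifier_type}/{modifier_name}"

a = modifier_node_id("SR", "signal", "normsys", "lumi")
b = modifier_node_id("CR", "background", "normsys", "lumi")
assert a != b  # distinct graph nodes, even though both modifiers are named "lumi"
print(a)       # both may still reference the same parameter, making them correlated
```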

HS3 Reference

histfactory_dist (HS3 specification section: hs3.histfactory-distribution)

Parameters:

data (Any)

__init__(**data)

Create a new model by parsing and validating input data from keyword arguments.

Raises pydantic_core.ValidationError if the input data cannot be validated to form a valid model.

self is explicitly positional-only to allow self as a field name.

Parameters:

data (Any)

Methods

__init__(**data)

Create a new model by parsing and validating input data from keyword arguments.

construct([_fields_set])

copy(*[, include, exclude, update, deep])

Returns a copy of the model.

dict(*[, include, exclude, by_alias, ...])

expression(context)

Evaluate and return a named PyTensor expression.

extended_likelihood(context[, _data])

Build constraint model for nuisance parameters.

from_orm(obj)

get_internal_nodes()

Return all internal nodes that need to be in the dependency graph.

get_parameter_list(context, param_key)

Reconstruct a parameter list from flattened indexed keys.

json(*[, include, exclude, by_alias, ...])

likelihood(context)

Build the HistFactory main Poisson likelihood.

log_expression(context)

Log-probability combining main likelihood with extended terms.

model_construct([_fields_set])

Creates a new instance of the Model class with validated data.

model_copy(*[, update, deep])

Returns a copy of the model.

model_dump(*[, mode, include, exclude, ...])

Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.

model_dump_json(*[, indent, ensure_ascii, ...])

Generate a JSON representation of the model.

model_json_schema([by_alias, ref_template, ...])

Generates a JSON schema for a model class.

model_parametrized_name(params)

Compute the class name for parametrizations of generic classes.

model_post_init(context, /)

This function is meant to behave like a BaseModel method to initialise private attributes.

model_rebuild(*[, force, raise_errors, ...])

Try to rebuild the pydantic-core schema for the model.

model_validate(obj, *[, strict, extra, ...])

Validate a pydantic model instance.

model_validate_json(json_data, *[, strict, ...])

Validate the given JSON data against the Pydantic model.

model_validate_strings(obj, *[, strict, ...])

Validate the given object with string data against the Pydantic model.

parse_file(path, *[, content_type, ...])

parse_obj(obj)

parse_raw(b, *[, content_type, encoding, ...])

process_parameter(param_key)

Process a single parameter that can be either a string reference or numeric value.

process_parameter_list(param_key)

Process a list parameter containing mixed string references and numeric values.

schema([by_alias, ref_template])

schema_json(*[, by_alias, ref_template])

update_forward_refs(**localns)

validate(value)

Attributes

constants

Dictionary of PyTensor constants generated from numeric field values.

model_computed_fields

model_config

Configuration for the model; should be a dictionary conforming to pydantic.config.ConfigDict.

model_extra

Get extra fields set during validation.

model_fields

model_fields_set

Returns the set of fields that have been explicitly set on this model instance.

parameters

Return all parameters used by this HistFactory distribution.

type

axes

samples

name