smartbuildsim.models.anomaly labels anomalous sensor readings with
IsolationForest and now also supports comparisons against LocalOutlierFactor
via smartbuildsim.evaluation.benchmark.run_anomaly_benchmark.
AnomalyDetectionConfig fields:

- sensor: sensor name to monitor.
- contamination: expected proportion of anomalies (passed to IsolationForest).
- random_state: ensures deterministic model output via smartbuildsim.config.
- rolling_window: forwarded to FeatureConfig to control rolling statistics.

Access the derived FeatureConfig via the feature_config property when calling
smartbuildsim.features.engineering.engineer_features directly.
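To make the relationship between the config fields and the derived FeatureConfig concrete, here is a minimal sketch. The field names come from the list above, but the dataclass bodies, defaults, and the stand-in FeatureConfig are illustrative assumptions, not the library's actual implementation:

```python
from dataclasses import dataclass


# Illustrative stand-ins for the library's config classes; field names follow
# the documentation above, implementation details and defaults are assumed.
@dataclass
class FeatureConfig:
    sensor: str
    rolling_window: int


@dataclass
class AnomalyDetectionConfig:
    sensor: str                  # sensor name to monitor
    contamination: float = 0.05  # expected proportion of anomalies
    random_state: int = 42       # deterministic model output
    rolling_window: int = 12     # controls rolling statistics

    @property
    def feature_config(self) -> FeatureConfig:
        # Derive the feature-engineering config from the anomaly config,
        # mirroring the feature_config property described above.
        return FeatureConfig(sensor=self.sensor, rolling_window=self.rolling_window)


config = AnomalyDetectionConfig(sensor="temperature", contamination=0.02)
print(config.feature_config)  # → FeatureConfig(sensor='temperature', rolling_window=12)
```

Deriving the feature config from a property keeps the two configs in sync without duplicating fields.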
detect_anomalies(data, config) performs the following steps:

1. Fits an IsolationForest with the engineered features.
2. Adds anomaly_score and is_anomaly columns to the returned dataframe.

Run the detection from the CLI with:

```
smartbuildsim model anomalies examples/configs/default.yaml
```
The command writes outputs/anomalies.csv, which is then consumed by
smartbuildsim viz plot for annotated visualisations.
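The score-then-flag shape of the detect_anomalies output can be illustrated with a simplified stand-in. The real module fits an IsolationForest on engineered features; here a plain z-score plays the scoring role over a list of dicts rather than a dataframe, and every name except the anomaly_score and is_anomaly columns is hypothetical:

```python
from statistics import mean, stdev


def detect_anomalies_sketch(values, threshold=3.0):
    """Toy stand-in for detect_anomalies: scores each reading and flags
    outliers. The real module uses IsolationForest on engineered features;
    a z-score against the whole series plays the same role here."""
    mu, sigma = mean(values), stdev(values)
    rows = []
    for value in values:
        score = abs(value - mu) / sigma  # stand-in for anomaly_score
        rows.append({
            "value": value,
            "anomaly_score": score,
            "is_anomaly": score > threshold,  # boolean flag column
        })
    return rows


readings = [20.1, 20.3, 19.9, 20.2, 35.0, 20.0, 20.1, 19.8]
flagged = [r for r in detect_anomalies_sketch(readings, threshold=2.0)
           if r["is_anomaly"]]
print([r["value"] for r in flagged])  # → [35.0]
```

Keeping the score alongside the boolean flag, as the real output columns do, lets downstream tooling re-threshold without rerunning the model.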
```python
from smartbuildsim.data.generator import DataGeneratorConfig, generate_dataset
from smartbuildsim.models.anomaly import (
    AnomalyDetectionConfig,
    detect_anomalies,
)
from smartbuildsim.scenarios.presets import get_scenario

scenario = get_scenario("office-small")
dataset = generate_dataset(
    scenario.building, DataGeneratorConfig(**scenario.data.dict())
)

config = AnomalyDetectionConfig(**scenario.anomaly.dict())
result = detect_anomalies(dataset, config)
flagged = result.data[result.data["is_anomaly"]]
print(flagged[["timestamp", "value", "anomaly_score"]].head())
```
See examples/scripts/run_example.py for the full pipeline that chains anomaly
labelling with clustering and plotting. Comparative experiments
(cross-validation, significance tests, and scaling analysis) are available in
examples/scripts/run_benchmarks.py.
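The comparison idea behind run_anomaly_benchmark can be sketched without the library: run two detectors over the same series and tabulate how many readings each flags. The real benchmark compares IsolationForest against LocalOutlierFactor; the z-score and IQR detectors below are simplified stand-ins, and every function name here is hypothetical:

```python
from statistics import mean, stdev, quantiles


def zscore_detector(values, threshold=2.0):
    # Flag readings far from the mean in standard-deviation units.
    mu, sigma = mean(values), stdev(values)
    return [abs(v - mu) / sigma > threshold for v in values]


def iqr_detector(values, k=1.5):
    # Flag readings outside the interquartile-range fences.
    q1, _, q3 = quantiles(values, n=4)
    lo, hi = q1 - k * (q3 - q1), q3 + k * (q3 - q1)
    return [v < lo or v > hi for v in values]


def run_benchmark_sketch(values):
    # Tabulate how many readings each detector flags, mirroring the
    # structure of a run_anomaly_benchmark comparison.
    return {
        "zscore": sum(zscore_detector(values)),
        "iqr": sum(iqr_detector(values)),
    }


readings = [20.1, 20.3, 19.9, 20.2, 35.0, 20.0, 20.1, 19.8]
print(run_benchmark_sketch(readings))  # → {'zscore': 1, 'iqr': 1}
```

Agreement between detectors on a shared series is the simplest benchmark signal; the library's version layers cross-validation and significance tests on top of it.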