CLOUD_NATIVE_SAAS // INFRASTRUCTURE_ENGINEERING // CROSS_PLATFORM_DELIVERY // DATA_RESIDENCY_COMPLIANCE // AVAILABILITY_ZONE_REDUNDANCY // ENCRYPTION_AT_REST // IDENTITY_ACCESS_MANAGEMENT // SYS-STATE: FULL_PRODUCTION // OPERATIONAL_CONTINUITY

| Research & Analysis

Strategic Insights

Research, analysis, and technical perspective structured for consequential decisions across security, infrastructure, and institutional technology.

When the Model Lies: Observability, Risk & AI Transparency

A Canadian traveller, Jake Moffatt, asked Air Canada’s website chatbot whether bereavement fares could be claimed after travel. The bot invented a 90-day refund window; relying on it, Mr Moffatt paid roughly CA \$1,600 for a ticket that should have cost about CA \$760, and the airline later refused to honour the promise. In February 2024, a civil tribunal ruled the answer “misleading” and ordered Air Canada to pay more than CA \$812 in damages, interest, and costs. One hallucination became a court case, damaged the airline’s reputation, and, by some estimates, generated around CA \$1,000,000 in indirect costs. That story is no longer an outlier: LLM errors are creeping into contracts, trading systems, and operational dashboards. The common thread is a lack of deep observability.
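One minimal form of the observability the article calls for is logging every chatbot answer next to the source passage it was supposed to be grounded in, and flagging specific claims (refund windows, fares) that the source does not support. The sketch below is illustrative, not a real library API; `AnswerRecord` and its grounding check are hypothetical names, and matching bare numbers is a deliberately crude stand-in for a real claim-verification step.

```python
import re
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class AnswerRecord:
    """One logged chatbot answer plus the policy text it was grounded in."""
    question: str
    answer: str
    source_text: str  # policy passage retrieved to answer the question
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

    def ungrounded_numbers(self) -> list[str]:
        """Numbers asserted in the answer but absent from the source."""
        answer_nums = set(re.findall(r"\d+", self.answer))
        source_nums = set(re.findall(r"\d+", self.source_text))
        return sorted(answer_nums - source_nums)

# A record resembling the Air Canada exchange: the policy contains no
# 90-day window, so the figure in the answer is flagged for review.
policy = "Bereavement fares must be requested before travel."
record = AnswerRecord(
    question="Can I claim a bereavement fare after my flight?",
    answer="Yes, submit your ticket within 90 days of issue for a refund.",
    source_text=policy,
)

flags = record.ungrounded_numbers()
if flags:
    print(f"ALERT: ungrounded figures {flags} logged for human review")
```

Even a crude check like this turns a silent hallucination into an auditable log entry, which is the difference between a refund request and a tribunal ruling.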
