How a Fire Nozzle Turned into an AI Data Asset
TL;DR
- HEN Technologies redesigned a fire nozzle using computational fluid dynamics (computer simulations of how liquids and gases move), then added sensors and edge compute to turn firefighting hardware into a source of high-fidelity physics data for AI and robotics (company-reported).
- The company claims major suppression and water savings (company-reported), has grown rapidly, closed a $20M Series A, and is positioning its data lake as a product for predictive analytics and world-model training (company-reported).
- For AI teams and public-safety leaders: pilot the hardware-plus-data feed, negotiate clear data governance, and treat procurement and integration as first-order risks.
Why a nozzle matters to AI and business leaders
Most AI programs chase clickstreams, logs, or web text. The real bottleneck for AI agents and robotics that operate in messy physical environments is high-quality, real-world physics data. A better nozzle sounds small until you realize every firefighting incident is a complicated physics experiment—flow rates, pressures, droplet sizes, wind, hydrant availability, pump coordination and human tactics all interact in ways simulation struggles to reproduce.
HEN Technologies’ bet is straightforward: design a superior physical tool, instrument it, and sell both hardware and the resulting operational data to the organizations that need predictive analytics and physics-aware models. That’s a hardware-first AI data strategy, and it matters for any business building AI for the physical world.
The engineering win: smarter droplet physics
HEN started with an engineering problem—make suppression faster and reduce water use. Using NSF-funded computational fluid dynamics (CFD) research, the team optimized droplet size, velocity and wind resistance. The company reports increases in suppression rates of up to ~300% and roughly 67% water savings (company-reported). Those figures come from the company’s lab and field testing; independent verification would be the next step for procurement teams.
Put plainly: faster suppression reduces time on scene, lowers water damage and can cut chances of re-ignition. For cash-strapped municipalities and military bases, lower water use and faster knockdown translate into operational and budgetary value, not just neat engineering metrics.
Instrumentation and edge compute: the nozzle as a data node
The nozzle is the field “muscle,” but the meaningful value is the system of connected devices and the data they produce.
HEN’s products include nozzles, monitors, valves, overhead sprinklers, pressure devices and a flow-control product called Stream IQ. The line uses about 23 custom circuit-board designs and, in some devices, Nvidia Jetson Orin Nano modules, small onboard computers that do initial processing (edge compute) before sending telemetry to the cloud.
Built-in sensors record flow, pressure, water used per incident, hydrant activity, GPS location and weather. That creates multiple data types (pressure, flow, GPS, weather, video, etc.) from real incidents—data that AI agents and world models need but rarely get at scale.
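HEN has not published its telemetry schema, so the sketch below is only an illustration of what one multi-sensor sample might look like; the field names, units and sampling assumptions are mine, not the company’s.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class NozzleTelemetry:
    """One hypothetical telemetry sample from an instrumented nozzle.

    Field names and units are illustrative assumptions; HEN's actual
    schema is not public.
    """
    device_id: str                # serial of the nozzle or monitor
    incident_id: Optional[str]    # links samples to a dispatch/incident record
    timestamp: datetime           # edge-device clock, ideally GPS-disciplined
    flow_gpm: float               # instantaneous flow, gallons per minute
    pressure_psi: float           # nozzle inlet pressure
    water_used_gal: float         # cumulative water used this incident
    lat: float                    # GPS position of the device
    lon: float
    wind_mph: Optional[float] = None   # from an on-scene or nearby weather feed
    temp_f: Optional[float] = None

# A single sample, as it might arrive from an edge device at roughly 1 Hz:
sample = NozzleTelemetry(
    device_id="nozzle-0421",
    incident_id="2025-000137",
    timestamp=datetime(2025, 3, 14, 2, 17, 30, tzinfo=timezone.utc),
    flow_gpm=148.0,
    pressure_psi=72.5,
    water_used_gal=310.0,
    lat=36.801,
    lon=-119.782,
    wind_mph=9.0,
    temp_f=54.0,
)
```

Even a simple record like this, captured consistently across thousands of incidents, is the kind of aligned, multimodal signal that is hard to assemble any other way.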
“Predictive analytics are useless without high-quality data, and you can’t get that data without the right hardware.”
Traction and commercial evidence
Commercial adoption matters when a company’s moat depends on deployed sensors and long-term data collection. HEN reports roughly 1,500 fire-department customers, distribution through about 120 dealers, and shipments to 22 countries (company-reported). Institutional customers include the Marine Corps, U.S. Army bases, Naval labs, NASA and Abu Dhabi Civil Defense (company-reported).
Revenue growth is company-reported: roughly $200K of first product revenue in Q2 2023, growing to about $1.6M, then $5.2M last year, with a projection of around $20M this year. The company raised a $20M Series A led by O’Neil Strategic Capital plus $2M in venture debt from Silicon Valley Bank, bringing total funding above $30M (company-reported). HEN has filed about 20 patent applications (roughly six granted so far) and achieved GSA qualification after a year-long vetting process, an important credential for government sales (company-reported).
Those signals—revenue trajectory, patents, GSA qualification and institutional customers—are the practical glue that can let a hardware-first data play scale. But public procurement is slow and political; selling hardware is one challenge, turning that installed base into paid analytics is another.
How the data becomes AI value: concrete use cases
High-fidelity incident telemetry unlocks several immediate AI applications:
- World-model training: Physics-aware models that simulate how water droplets, pressure waves and ventilation affect fire spread—useful for robotics and tactics planning.
- Anomaly detection: Real-time alerts for pump or hose failures based on deviations from expected pressure/flow signatures (a minimal sketch follows this list).
- Reinforcement learning for autonomy: Training policies for autonomous firefighting drones or robotic nozzles to optimize stream angle and pressure under varying wind and structure conditions.
- Predictive logistics: Forecasting water resupply needs, pump staging, and mutual-aid recommendations to reduce downtime during large incidents.
- Post-incident analytics: Forensics and continuous improvement—what tactics and nozzle settings led to fastest knockdown with least water?
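To make the anomaly-detection item concrete, a minimal baseline is a rolling z-score over the pressure trace: estimate the expected band from recent samples and flag readings that fall far outside it. This is a generic sketch, not HEN’s algorithm; the window size and threshold are assumptions a pilot team would tune against real traces.

```python
import numpy as np

def pressure_anomalies(pressure_psi, window=30, z_thresh=4.0):
    """Flag indices where pressure deviates sharply from the recent rolling band.

    Generic rolling z-score baseline (not HEN's method): `window` trailing
    samples define the expected band, and `z_thresh` is the number of
    standard deviations that triggers an alert.
    """
    p = np.asarray(pressure_psi, dtype=float)
    flags = []
    for i in range(window, len(p)):
        recent = p[i - window:i]
        mu, sigma = recent.mean(), recent.std()
        if sigma == 0:
            if p[i] != mu:          # any change from a perfectly flat baseline
                flags.append(i)
            continue
        if abs(p[i] - mu) / sigma > z_thresh:
            flags.append(i)         # candidate pump fault, burst length, or kinked hose
    return flags

# Example: a steady 70 psi attack line with a sudden pressure drop.
trace = [70.0] * 60 + [38.0] * 5 + [70.0] * 20
print(pressure_anomalies(trace))    # flags the onset of the drop
```

Something like this could run on the edge device against live flow and pressure channels, with alerts pushed to the pump operator rather than printed.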
These are not hypothetical. Companies building AI agents that interact with fluids or real-world forces struggle to get labeled, multimodal datasets. HEN’s recordings are the sort of rare, messy, multi-sensor signal that closes the sim-to-real gap for physical AI agents.
Business model and commercialization paths
There are several monetization levers:
- Hardware sales: GSA and distributor channels for nozzles and control gear—one-time or replacement-cycle revenue.
- Subscription analytics: Per-department SaaS for incident dashboards, predictive alerts and performance benchmarking.
- Data licensing: Aggregated or anonymized datasets sold to robotics companies, academic labs, or AI startups training world models.
- OEM partnerships: Integrations with pump makers, apparatus OEMs, or software vendors (dispatch/records) for embedded data services.
For public-safety buyers, hybrid models are sensible: core hardware sold under GSA or standard procurement, with basic analytics included and premium data or model access as optional paid tiers. This reduces sticker shock while creating a path to recurring revenue with higher margins than hardware alone.
Risks, governance and practical mitigations
Turning operational telemetry into a commercial asset brings policy, legal and integration risks.
- Data ownership and public records: Incident telemetry may be subject to public records or FOIA requests. Contracts must specify ownership, retention, and anonymization rules.
- Procurement constraints: Cash-strapped municipalities expect low-cost baseline tools. Design contracts that provide essential functionality while gating advanced analytics behind optional agreements.
- Interoperability: Value is limited if the platform is siloed. Open APIs and partnerships with major public-safety software vendors improve adoption.
- Security and liability: Live telemetry from active incidents raises liability concerns. Encryption, access controls, SLAs, and compliance audits are non-negotiable.
- Generalizability: AI models trained on one region’s fires may not transfer cleanly to wildland, urban, coastal or high-wind environments. Domain-adaptation and diverse deployments mitigate this.
Practical mitigations
- Negotiate clear data governance clauses — who can use the data, for what purposes, and how long it is retained.
- Offer baseline analytics free or low-cost for agencies and charge for advanced analytics or raw dataset access to commercial partners (a sketch of what an anonymized, licensable aggregate might look like follows this list).
- Publish APIs and integration guides; invest in partnerships with dispatch and records vendors to avoid workflow friction.
- Implement and certify security controls and provide independent audits for customers to reduce procurement friction.
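What “anonymized and aggregated” can mean in practice: strip device and incident identifiers, coarsen GPS to a grid cell, and release per-incident summaries rather than raw traces. The sketch below is one possible policy, not HEN’s; the field names mirror the hypothetical telemetry schema above and the grid size is an assumption.

```python
import hashlib
from statistics import mean

def anonymize_incident(samples, grid_deg=0.1, salt="rotate-this-secret"):
    """Reduce raw per-incident telemetry to an aggregate record suitable for licensing.

    One possible policy (illustrative only): identifiers become salted hashes,
    GPS is coarsened to a roughly 10 km grid cell, and only summary statistics
    survive. Each sample is a dict with the hypothetical fields sketched earlier.
    """
    first = samples[0]
    opaque_id = hashlib.sha256((salt + first["incident_id"]).encode()).hexdigest()[:12]
    return {
        "incident_ref": opaque_id,                                  # not reversible without the salt
        "grid_cell": (round(first["lat"] / grid_deg) * grid_deg,    # coarse location only
                      round(first["lon"] / grid_deg) * grid_deg),
        "duration_s": len(samples),                                 # assuming ~1 Hz sampling
        "peak_flow_gpm": max(s["flow_gpm"] for s in samples),
        "mean_pressure_psi": mean(s["pressure_psi"] for s in samples),
        "total_water_gal": samples[-1]["water_used_gal"],
    }
```

Writing the policy down as code also gives legal and procurement teams something concrete to review against the contract’s anonymization clauses.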
Competitive landscape and moat
Incumbent hardware vendors (e.g., IDEX Corp) and software providers (CentralSquare, First Due) are active in adjacent categories. HEN’s unique angle is combining specialized nozzle engineering, embedded compute, and an emerging data lake.
Replication is possible but costly. Time in the field, distribution relationships, patents, and the installed sensor base create friction for late entrants. That said, large incumbents could accelerate replication by acquiring sensor-savvy startups or licensing similar tech. The real durable asset is the dataset tied to long-term deployments across diverse incident types.
What AI and robotics teams should ask for
Three technical questions to bring to any pilot or procurement:
- Data format & access:
What schemas, sampling rates, and export APIs are available? Do you get raw telemetry or only pre-aggregated metrics? (A hypothetical export-pull sketch follows this list.)
- Labeling and context:
Are incident labels (structure type, tactics used, outcome) captured and accessible? How are edge-processed events synchronized with timestamps?
- Continuity & scale:
What SLAs exist for data delivery and retention? How many devices / incidents are available across seasons and geographies for model training?
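If a vendor exposes a raw-telemetry export API, a pilot team’s first integration test can be as simple as pulling one incident window and checking the fields against the promised schema. The endpoint, parameters, and field names below are hypothetical placeholders, not a documented HEN API.

```python
import requests  # third-party HTTP client: pip install requests

# Hypothetical export endpoint and credentials; substitute whatever the vendor actually provides.
BASE_URL = "https://api.example-vendor.com/v1/telemetry"
API_KEY = "replace-with-pilot-credentials"

def fetch_incident(incident_id: str, start: str, end: str) -> list[dict]:
    """Pull raw samples for one incident window and run a basic schema check."""
    resp = requests.get(
        BASE_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        params={"incident_id": incident_id, "start": start, "end": end},
        timeout=30,
    )
    resp.raise_for_status()
    samples = resp.json()["samples"]

    # Verify the fields you were promised before committing to model training.
    required = {"timestamp", "flow_gpm", "pressure_psi", "lat", "lon"}
    missing = required - set(samples[0]) if samples else required
    print(f"{len(samples)} samples; missing fields: {missing or 'none'}")
    return samples
```

Running a check like this in week one of a pilot surfaces schema gaps, aggregation surprises, and access-control friction long before they can block a model-training effort.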
For procurement and legal teams
Checklist items to include in contracts:
- Explicit data ownership and licensing terms; options to revoke commercial use if needed.
- Retention limits and anonymization standards to address FOIA/public records concerns.
- Security certifications, audit rights and breach notification SLAs.
- Interoperability commitments (APIs, data exports) and integration timelines.
Customer vignette (company-reported)
One Marine Corps base that piloted Stream IQ reported faster pressure stabilization during pump handoffs and a measurable reduction in water consumption during training scenarios, giving leaders confidence to expand deployments (company-reported).
Final takeaways for leaders
- Hardware can be a data strategy. A small, practical product improvement—better droplet physics and embedded sensors—can seed a data asset that powers AI for physical systems.
- Winning firefighters’ hearts is necessary; surviving procurement and compliance is the hard part. Design deals that separate essential hardware from premium analytics to ease adoption.
- For AI teams building world models or autonomous agents, partnerships with hardware-first companies offer the quickest route to real incident datasets that improve sim-to-real transfer.
“Building a product that frontline users love is one thing; navigating government procurement and getting institutional buy-in is the tougher part.”
“Companies developing robotics and predictive physics engines would pay well for the type of real-world data HEN is collecting.”
Next steps
If you run robotics, emergency operations, or predictive analytics teams: start a small pilot focused on a single station or specialty engine. Negotiate clear data governance up front, request raw telemetry exports and labeling context, and plan integration work with your dispatch and records vendors. That approach lets you test the model and prove value before scaling procurement.
For C-suite leaders evaluating AI for business and AI automation in physical domains, note the lesson here: durable AI advantage often starts with better sensors and field deployments—tools that convert routine operations into high-quality, hard-to-replicate datasets.