Rural 5G Reality Check: 15‑Hour Test of AT&T, T‑Mobile & Verizon
TL;DR: Over a 15‑hour country‑road route through Iowa and southern Wisconsin, three Samsung Galaxy S26 Ultra phones running carrier eSIMs collected ~52,000 nPerf samples. T‑Mobile showed the most 5G presence and was the only carrier to record standalone 5G (5G SA). Verizon delivered stronger average signal levels (bars) and better dBm (“signal volume”), with AT&T generally between the two. For businesses operating off‑interstate, a visible 5G icon doesn’t guarantee usable throughput—prioritize signal strength, latency and uptime over marketing claims.
Methodology — how the test was run
- Route and duration: A ~15‑hour weekend trip on country roads through Douds, IA; Platteville and Janesville, WI; and back toward Chicagoland (no interstates).
- Devices: Three identical Samsung Galaxy S26 Ultra phones (one supplied by Samsung), each provisioned with a different carrier's eSIM (AT&T, T‑Mobile, Verizon).
- Power and mounting: Phones powered from an Anker Solix C1000 portable battery and held in a makeshift tripod/PVC rig.
- Measurement tool: nPerf — tests exported as a data dump yielding roughly 52,000 datapoints.
- Metrics captured: Network type (LTE, LTE‑Advanced, 5G NSA, 5G SA), phone network level (bars), raw signal strength (dBm), and throughput/latency where measurements were taken.
- Tethering note: An Oppo Find N6 was used for some tethering tests; it’s not optimized for US bands and could affect tether results.
- Limitations: Single route, one phone model, and time window. Bars are vendor‑specific; dBm is a more consistent cross‑device measure.
Quick glossary (plain language)
- dBm: Think of this as “signal volume”—numbers are negative; closer to zero = louder/stronger signal (for example, −70 dBm is stronger than −110 dBm).
- 5G NSA (non‑standalone): Like putting a new engine on an old chassis—some 5G benefits, but it still depends on 4G core systems.
- 5G SA (standalone): A native 5G core—fuller latency and feature benefits when coupled with good radio signal.
- nPerf: A network performance testing app that records throughput, latency and radio type over time; used here to generate the dataset.
Key findings
- T‑Mobile recorded the largest share of 5G readings and was the only carrier to register standalone 5G (5G SA) on this route. About 90% of T‑Mobile’s 5G samples showed SA.
- Verizon led on average network level (bars) and raw signal strength (dBm). It reported a “good” bar level roughly 44% of the time across the route.
- AT&T consistently fell between Verizon and T‑Mobile on both bars and dBm.
- Practical impact: two brief slowdowns and one short stretch of complete internet outage in southern Wisconsin, roughly 20 minutes of total downtime over the trip.
- Bottom line: 5G presence (the icon) and usable 5G connectivity (strong signal, low latency, steady throughput) are different metrics. Businesses should measure the latter when assessing risk.
Presence vs. usability — why the 5G icon can lie
Seeing “5G” next to your carrier name is useful for marketing but not always for operations. Two separate things matter:
- Presence: Does the device register 5G at all? We saw T‑Mobile dominate by this measure; it even showed standalone 5G in rural pockets.
- Usability: Is the signal strong and stable enough for your task? This depends on dBm, latency (ms), and sustained throughput (Mbps). Verizon typically delivered better usability metrics on this route.
Analogy: presence is the light on the dashboard; usability is whether the engine actually runs smoothly under load. 5G NSA can show a 5G badge while still relying on 4G systems—so latency improvements are limited. Only 5G SA with good dBm will reliably unlock lower latency and higher session consistency for advanced field uses.
What mattered most for field productivity
- Signal volume (dBm): The clearest cross‑device indicator for whether an application will perform. Rough guidance: better than −95 dBm is reasonable; better than −85 dBm is strong; worse than −105 dBm is marginal for many real‑time tasks.
- Latency: For VoIP/video support or AR guidance, target <100 ms; for remote control or low‑latency telemetry, aim for <50 ms.
- Throughput: Small telemetry is tolerant; video/remote support needs sustained 2–5+ Mbps upload and download depending on resolution and codec.
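As a rough sketch, these thresholds can be encoded in a helper that scores each sample. The function name, tier labels, and return shape are illustrative rules of thumb, not part of nPerf or any carrier tooling:

```python
def classify_sample(dbm: float, latency_ms: float, mbps: float) -> dict:
    """Score one measurement against the rough guidance above.

    Thresholds are illustrative, not carrier specifications.
    """
    if dbm > -85:
        signal = "strong"
    elif dbm > -95:
        signal = "reasonable"
    elif dbm > -105:
        signal = "borderline"  # the guidance above leaves this band unlabeled
    else:
        signal = "marginal"
    return {
        "signal": signal,
        # VoIP/video support or AR guidance: <100 ms plus sustained 2+ Mbps
        "voip_video_ok": latency_ms < 100 and mbps >= 2,
        # Remote control / low-latency telemetry: <50 ms on a solid link
        "remote_control_ok": latency_ms < 50 and signal in ("strong", "reasonable"),
    }
```

For example, a −80 dBm sample at 40 ms and 10 Mbps scores "strong" and passes both checks, while a −110 dBm sample at 150 ms fails them.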
Business impact: five real-world use cases and what to watch for
- Farm IoT telemetry
What matters: steady uplink, moderate throughput, tolerance of some latency. A carrier with consistent dBm across fields matters more than a fleeting 5G SA icon.
- Mobile point‑of‑sale (retail pop‑ups)
What matters: reliability and low outage frequency. Even brief dropouts cost sales. Prioritize percent time at usable throughput and SLA language over marketing claims.
- Delivery/logistics routing and telematics
What matters: uptime and consistent small‑packet throughput. Hit rates for location and telemetry updates depend on consistent signal strength.
- Remote visual support (video troubleshooting)
What matters: latency and sustained upload bandwidth. 5G SA helps if the radio link (dBm) supports it; otherwise LTE with good dBm can outperform flaky 5G NSA.
- Edge compute or remote control for field equipment
What matters: low latency and ultra‑reliable connections. Only a strong, stable radio link (good dBm) plus 5G SA (where available) closes the gap for these high‑risk applications.
Practical checklist: how to test rural cellular for your fleet
- Pick representative routes and times: Test the exact roads, times, and days your crews work (repeat across shifts/days).
- Use production devices: Test phones, tablets, and tethering hardware your teams actually use; different modems and firmware change results.
- Run nPerf or equivalent continuously: Record dBm, network type, throughput, latency, packet loss and timestamps. Aim for high sample counts (thousands of samples per route).
- Include tethering and peripherals: If teams tether devices or use external hotspots, include those in tests—tether device compatibility matters.
- Log outages and user impact: Note when an app fails (payment drop, failed upload) and correlate with dBm/latency at that time.
- Aggregate into usable KPIs: Percent time above usable thresholds (e.g., download >2 Mbps & latency <100 ms), mean outage duration, percent time at dBm better than −95 dBm.
- Repeat and compare: Run at least three repeats across different days/seasons for statistical confidence.
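The aggregation step can be sketched in plain Python. The sample keys (`dbm`, `latency_ms`, `down_mbps`), the per-sample connectivity flag, and the fixed sample interval are assumptions; adapt them to your actual nPerf export columns:

```python
from statistics import mean

def route_kpis(samples: list[dict]) -> dict:
    """Roll time-ordered measurements up into the KPIs above.

    Expected keys per sample ('dbm', 'latency_ms', 'down_mbps') are
    illustrative; rename to match your nPerf CSV export.
    """
    n = len(samples)
    usable = sum(1 for s in samples
                 if s["down_mbps"] > 2 and s["latency_ms"] < 100)
    good_signal = sum(1 for s in samples if s["dbm"] > -95)
    return {
        "pct_usable": 100 * usable / n,
        "pct_dbm_above_minus95": 100 * good_signal / n,
    }

def outage_stats(connected: list[bool], sample_interval_s: int = 10) -> dict:
    """Count outages and their mean duration from one connectivity flag
    per sample, assuming evenly spaced samples."""
    durations, run = [], 0
    for ok in connected:
        if not ok:
            run += 1
        elif run:
            durations.append(run * sample_interval_s)
            run = 0
    if run:  # route ended mid-outage
        durations.append(run * sample_interval_s)
    return {"count": len(durations),
            "mean_s": mean(durations) if durations else 0.0}
```

With 10-second samples, two missed samples followed later by one missed sample yields two outages averaging 15 seconds; run the same roll-up per carrier and per repeat to compare like for like.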
Suggested procurement language (SLA snippet)
Minimal example for RFPs and carrier discussions:
Carrier must provide upload/download throughput of ≥5 Mbps for at least 95% of measurements and median latency ≤100 ms across the specified routes. Maximum allowable cumulative outage: 120 minutes/month on listed routes. Carrier must provide prioritized troubleshooting within 4 business hours for outages affecting 10% or more of fleet devices.
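A compliance check against that kind of SLA language can be sketched in a few lines. The thresholds mirror the example snippet, and the sample keys (`down_mbps`, `up_mbps`, `latency_ms`) are assumptions to map onto your own measurement export:

```python
from statistics import median

def sla_compliant(samples: list[dict],
                  min_mbps: float = 5.0,
                  max_median_latency_ms: float = 100.0,
                  pct_required: float = 95.0) -> bool:
    """Check a route's measurements against example SLA thresholds.

    Sample keys ('down_mbps', 'up_mbps', 'latency_ms') are illustrative.
    """
    # Share of samples meeting the throughput floor in both directions
    meets_tput = sum(1 for s in samples
                     if s["down_mbps"] >= min_mbps and s["up_mbps"] >= min_mbps)
    pct = 100 * meets_tput / len(samples)
    med = median(s["latency_ms"] for s in samples)
    return pct >= pct_required and med <= max_median_latency_ms
```

Feeding each carrier's route data through a check like this turns the SLA discussion into a pass/fail table rather than a debate over coverage maps.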
Key questions answered
- Which carrier showed the most 5G presence on these rural roads?
T‑Mobile: it recorded the most 5G readings and was the only carrier to show standalone 5G (5G SA) on this route—about 90% of its 5G samples were SA.
- Which carrier delivered stronger, more usable signals overall?
Verizon: it led on average network level (bars) and signal strength (dBm), reporting “good” bar levels roughly 44% of the time on this trip.
- Was rural connectivity uniformly terrible?
No. There were two brief slowdowns and a single regional outage in southern Wisconsin (about 20 minutes), but most of the route remained usable most of the time.
- How should businesses evaluate connectivity risk?
Focus on usable metrics—dBm, latency, sustained throughput and historical outage patterns—tested on your actual devices and routes. Don’t rely on a 5G badge alone.
Limitations and what to watch for
- Single route and limited time window—results are illustrative, not definitive for all rural America.
- One phone model was used; different devices may show different band behavior and bar mapping.
- Tethering hardware (Oppo Find N6) used in some tests is not optimized for US bands and could depress tether performance.
- Carrier deployments change frequently—repeat tests over months and seasons for procurement decisions.
Next steps and data availability
Make tests repeatable and measurable: export raw nPerf CSVs, map samples against GPS, and compute percent time above your usable thresholds. Visualizations that help procurement teams include route heatmaps of dBm, a timeline of network type (LTE/NSA/SA), and a chart of percent time at usable throughput per carrier.
If you want to run a pilot, start with the seven‑point checklist above, capture at least three runs per route, and demand SLA language tied to measurable KPIs rather than iconography. Practical, reproducible data beats marketing when connectivity is part of the job.
Three one‑line takeaways for executives
- T‑Mobile leads on 5G presence (including standalone 5G pockets); Verizon leads on raw signal strength and usable connectivity on this route.
- For field ops, prioritize dBm, latency and percent time at usable throughput over a 5G badge.
- Run device‑level tests on your actual routes and hardware, and bake measurable SLAs into procurement decisions.
SEO note (suggested)
Meta title: Rural 5G Reality Check: 15‑Hour Test of AT&T, T‑Mobile & Verizon
Meta description: Real‑world 15‑hour test across Iowa and Wisconsin: T‑Mobile showed the most 5G presence while Verizon delivered stronger, more usable signals—what field teams should prioritize.