How We Test
The protocols, rating categories, and data sources behind every review, listicle, and country guide on Earth SIMs.
A review is only as trustworthy as the process behind it. This page documents how we collect and interpret the data that ends up as a rating or recommendation on Earth SIMs. It is deliberately granular — if you want to know exactly what the "4.3" next to a product name means, this is the page.
1. eSIM Testing
Test protocol
- Devices: we test activation on at least two devices per review — typically an iPhone (iOS 17+) and a Pixel or Samsung Galaxy (Android 14+). Where a provider claims specific device support (e.g., Apple Watch, iPad), we also test that device.
- Activation: we time the activation from first click on the provider's checkout to first successful data connection. We note whether QR code, app-based, or direct install is used, and whether the SIM auto-activates at first connection or requires a manual step.
- Speed testing: we run Ookla Speedtest three times per test location — morning, afternoon, and evening local time — and report median download, upload, and latency for each location. Test locations include at least one major city, one secondary city, and one rural or transit area.
- Coverage check: we record signal behavior across at least three carriers in the country where possible, since eSIM providers may roam onto different local networks.
- Testing window: reviews are based on a minimum of 14 days of use per provider. Multi-country reviews require a minimum of four weeks.
- Support probe: we send one real support request per provider — typically about a real issue (e.g., a stuck activation) — and record response time and usefulness.
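The speed protocol above reduces three daily runs per location to a single median figure for each metric. A minimal sketch of that reduction, with illustrative numbers (the location name and field names are ours, not from any real test log):

```python
from statistics import median

# Illustrative sample: three Ookla runs (morning, afternoon, evening)
# for one hypothetical test location. All figures are made up.
runs = {
    "Lisbon (city center)": [
        {"down": 87.2, "up": 21.4, "latency": 28},  # morning
        {"down": 64.9, "up": 18.1, "latency": 35},  # afternoon
        {"down": 71.5, "up": 19.8, "latency": 31},  # evening
    ],
}

def summarize(samples):
    """Median download (Mbps), upload (Mbps), and latency (ms) across the day's runs."""
    return {
        metric: median(s[metric] for s in samples)
        for metric in ("down", "up", "latency")
    }

for location, samples in runs.items():
    print(location, summarize(samples))
```

Medians rather than means keep one unusually fast or slow run from skewing the reported figure.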
Rating categories (eSIM reviews)
- Speed & reliability. Median download speed against expectations for the country, variance across locations, dropout rate.
- Coverage & carrier choice. How many local carriers the eSIM can roam onto; whether the user can choose or is locked to a single partner network.
- Pricing. Per-GB cost against the country median; plan flexibility (top-ups, refunds, unused-data handling).
- Setup & usability. Activation time, app quality, how the plan's expiration and data balance are exposed to the user.
- Support. Response time, channel availability, usefulness of the actual response.
2. VPN Testing
Test protocol
- Testing window: minimum 21 days of real use per VPN across at least two countries.
- Network conditions: we test on home fiber, public café WiFi, and mobile tethering. Each condition is tested in both the VPN's recommended protocol and an alternative protocol (WireGuard vs. OpenVPN vs. the provider's proprietary protocol).
- Speed: we measure baseline (no VPN) and connected speeds to at least three server locations — one in-country, one in the EU, and one in North America — and report median download, upload, and latency.
- Leak tests: we run DNS leak, WebRTC leak, and IPv6 leak tests on every review and report results.
- Streaming: we test access to at least Netflix, BBC iPlayer, and a region-locked bank login from a VPN server outside the service's home region.
- Kill switch behavior: we cut the VPN tunnel mid-transfer and verify that traffic stops rather than leaking over the native connection.
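The speed rating below is built on retained throughput, not raw speed: the connected figure for each server is compared against the no-VPN baseline measured on the same connection. A sketch of that calculation, with invented numbers:

```python
def retention(baseline_mbps, vpn_mbps):
    """Share of no-VPN throughput retained while connected, as a percentage."""
    return round(vpn_mbps / baseline_mbps * 100, 1)

# Illustrative figures only: one baseline, three server locations.
baseline = 250.0  # home fiber, no VPN
servers = {"in-country": 212.5, "EU": 180.0, "North America": 95.0}

for name, speed in servers.items():
    print(f"{name}: {retention(baseline, speed)}% of baseline")
```

Reporting a percentage makes results comparable across reviewers with very different home connections.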
Rating categories (VPN reviews)
- Speed. Percentage of baseline throughput retained under VPN, variance across servers.
- Privacy & audits. Independently audited no-logs policy, jurisdiction, warrant canary or equivalent.
- Streaming. Reliability in unblocking the major services above, not just advertised compatibility.
- Apps & platform coverage. Feature parity across macOS, Windows, iOS, Android, Linux, routers.
- Pricing. Effective monthly cost on the most common 1-year or 2-year plan; transparency of renewal pricing.
3. Travel Insurance Evaluation
Travel insurance is YMYL ("Your Money or Your Life") content — mistakes can affect a reader's financial wellbeing. We do not rate travel insurance the way we rate consumer hardware. We evaluate against the policy document itself.
Test protocol
- We read the full policy wording, not the marketing copy.
- We run each policy against a fixed set of realistic nomad scenarios (medical emergency in Southeast Asia, laptop theft in Europe, trip cancellation due to a named hurricane, pregnancy-related complication abroad). We record how the policy responds to each.
- Where possible we aggregate verifiable claim experiences from real policyholders and flag patterns.
- We never recommend insurance without explicitly naming the scenarios it does not cover.
Rating categories (travel insurance)
- Medical coverage depth. Limits, deductibles, repatriation.
- Nomad fitness. Multi-country coverage, long-duration policies, home-country coverage rules.
- Claim experience. Documented response times and user-reported outcomes.
- Exclusions. Named exclusions weighted by likelihood of hitting a typical nomad.
- Pricing. Cost per 30 days of cover against comparable policies.
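Because policies come in very different durations, the pricing comparison normalizes each to a 30-day equivalent. A minimal sketch, with hypothetical policy prices:

```python
def cost_per_30_days(total_price, coverage_days):
    """Normalize a policy's total price to a 30-day equivalent for comparison."""
    return round(total_price / coverage_days * 30, 2)

# Illustrative: a 90-day policy at $168 vs. a 28-day policy at $59.
# The shorter policy looks cheaper at checkout but costs more per month.
print(cost_per_30_days(168.0, 90))
print(cost_per_30_days(59.0, 28))
```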
4. Mobile Hotspot & Hardware Reviews
- Testing window: minimum 14 days of field use.
- Battery: we measure real runtime under a standard workload (single streaming session + email + light browsing) rather than relying on the manufacturer's idle rating.
- Signal: we test in at least three signal environments — strong (city-center), moderate (suburban), and weak (rural, or behind obstructions) — and report connection stability, not just peak throughput.
- Concurrent clients: we load each hotspot with the number of devices a typical nomad would connect simultaneously (e.g., laptop, phone, tablet) and measure throughput degradation.
5. Country Connectivity Guides
Our country guides combine on-the-ground testing with publicly available authoritative data.
- Ground testing: wherever a country guide claims specific speeds or venue-level detail, we have been there during the stated testing window. We publish the dates and cities we tested.
- Cross-reference data: country-level speed medians and rankings come from Ookla Speedtest Global Index, cross-referenced against Cable.co.uk's Worldwide Broadband Speed League and, where available, Opensignal mobile experience reports.
- Pricing: carrier and eSIM pricing is verified from the vendor's own page on the publish or update date and marked with that date.
- Regulatory claims: rules around data caps, SIM registration, or VPN legality are sourced from the country's telecom regulator or a comparable primary source.
The aggregated dataset behind our country guides is published openly at earthsims.com/data/, with sources and date ranges.
6. Our Rating Scale
All ratings on Earth SIMs use a 0–5 scale. Rating categories are weighted differently per product category (see above). Overall scores round to one decimal.
| Score | What it means |
|---|---|
| 4.5–5.0 | Best in class. We recommend without meaningful reservations. |
| 4.0–4.4 | Strong. Recommended for most use cases with specific caveats noted. |
| 3.5–3.9 | Acceptable. Works, but the product has a real weakness a reader needs to know about. |
| 3.0–3.4 | Below average. Only recommended for narrow, specific situations. |
| Below 3.0 | We do not recommend. The review explains why. |
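The overall score is a weighted mean of the category scores, rounded to one decimal. A minimal sketch of that arithmetic — the weights and scores below are hypothetical, since actual weights vary by product category:

```python
def overall(scores, weights):
    """Weighted mean of 0-5 category scores, rounded to one decimal."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return round(sum(scores[c] * w for c, w in weights.items()), 1)

# Hypothetical eSIM review: category names and weights are illustrative.
weights = {"speed": 0.30, "coverage": 0.20, "pricing": 0.20, "setup": 0.15, "support": 0.15}
scores  = {"speed": 4.5, "coverage": 4.0, "pricing": 3.5, "setup": 5.0, "support": 4.0}

print(overall(scores, weights))
```

Under this example weighting, a product can score 5.0 on setup and still land in the "Strong, with caveats" band if speed and pricing lag.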
7. When We Update Reviews
- Flagship reviews (Saily, Holafly, NordVPN, Surfshark, Proton VPN, SafetyWing) are re-tested at least once per quarter.
- Country guides are re-verified at least twice per year, and any time a major carrier event (new 5G rollout, carrier merger, plan overhaul) warrants it.
- "Last updated" and "Last tested" dates on every review reflect the most recent verification.
8. What We Do Not Rate
- Products we have not tested. If a product is mentioned in our content without a rating, we have not put it through the full protocol.
- Vendor claims we cannot verify. If a vendor advertises a feature we cannot independently observe, we either omit it or quote it and attribute the source.
Questions
Questions about a specific test, or want to request that we put something on the test bench? Email editorial@earthsims.com.