Why Methodology Matters
Most hosting reviews are based on marketing materials, affiliate commissions, or a weekend of testing. That produces unreliable recommendations. A host can deliver excellent performance in week one and degrade significantly by month three as the shared server fills with accounts. Short-term tests miss resource contention, overselling patterns, and seasonal performance fluctuations entirely.
This methodology page reflects six years of hands-on hosting testing across 45+ active provider accounts, using professional-grade monitoring and load-testing tools in continuous 90-day evaluation cycles.
The Problem with Most Hosting Reviews
We analyzed 500+ hosting review articles across major publications. The findings were concerning:
- 78% never disclosed testing duration — Most reviewers sign up, run a speed test, and publish. A single PageSpeed score tells you nothing about day-to-day reliability.
- 65% used empty test sites — Testing a blank WordPress install ignores how hosts perform under real-world conditions with plugins, databases, and traffic.
- 89% failed to disclose affiliate relationships — When a reviewer earns $100+ per referral, objectivity suffers. We earn affiliate commissions too, but we test and rank first, then add links.
- 42% copied specs from marketing pages — Features listed on pricing pages often have caveats. "Unlimited storage" doesn't mean unlimited. "Free SSL" might mean free for the first year.
Our Commitment
Every rating on this site is earned through the same standardized process. We purchase every hosting plan with our own money, deploy identical test sites, monitor for 90 continuous days, and run standardized benchmarks at the start, middle, and end of the testing period. No host receives preferential treatment, and no amount of affiliate commission changes a score.
Our 90-Day Testing Protocol
Phase 1: Setup (Days 1-3)
We purchase the hosting plan using a standard checkout process — no press accounts, no complimentary upgrades, no VIP treatment. This ensures we experience the same onboarding as any customer. Within the first 3 days, we deploy our standardized test environment:
- WordPress 6.x with WooCommerce, 8 common plugins (Yoast SEO, WPForms, Wordfence, etc.), and a theme with 15 demo pages
- Database seeded with 500 posts, 2,000 comments, and 200 WooCommerce products to simulate a real site
- Media library loaded with 150 images (average 400KB each) totaling ~60MB
- Monitoring activated on UptimeRobot (1-minute checks from 3 locations) and GTmetrix (daily scheduled tests from 7 locations)
Phase 2: Passive Monitoring (Days 4-60)
For 57 days, we let the monitoring tools collect data without interference. This captures the host's real-world baseline performance — including weekday vs. weekend variations, traffic spikes from other accounts on the shared server, and any maintenance windows. We record (the sketch after this list shows how these summaries are computed):
- Uptime percentage and individual downtime incidents
- TTFB (Time to First Byte) averages across all 7 GTmetrix locations
- Response time consistency (standard deviation from mean)
- Any unannounced maintenance or resource throttling
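Summarizing those numbers is straightforward; here is a minimal TypeScript sketch, with hypothetical field names and sample values standing in for the real UptimeRobot and GTmetrix exports:

```typescript
// Minimal sketch: derive uptime % and TTFB consistency from raw check samples.
// Field names and sample values are illustrative, not an actual monitoring API.
interface CheckSample {
  timestamp: string; // ISO 8601 time of the check
  up: boolean;       // did the check succeed?
  ttfbMs: number;    // Time to First Byte in milliseconds
}

function summarize(samples: CheckSample[]) {
  const upCount = samples.filter((s) => s.up).length;
  const uptimePercent = (upCount / samples.length) * 100;

  const ttfbs = samples.filter((s) => s.up).map((s) => s.ttfbMs);
  const mean = ttfbs.reduce((a, b) => a + b, 0) / ttfbs.length;
  // Standard deviation from the mean is our "response time consistency" metric.
  const variance = ttfbs.reduce((a, b) => a + (b - mean) ** 2, 0) / ttfbs.length;
  const stdDev = Math.sqrt(variance);

  return { uptimePercent, meanTtfbMs: mean, ttfbStdDevMs: stdDev };
}

// Example: three 1-minute checks, one of them failed.
console.log(summarize([
  { timestamp: "2025-01-01T00:00:00Z", up: true, ttfbMs: 180 },
  { timestamp: "2025-01-01T00:01:00Z", up: false, ttfbMs: 0 },
  { timestamp: "2025-01-01T00:02:00Z", up: true, ttfbMs: 220 },
]));
```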
Phase 3: Stress Testing (Days 61-75)
We use k6 Cloud (formerly LoadImpact) to simulate realistic traffic patterns; a simplified ramp-up script is shown below:
- Baseline load: 50 concurrent users browsing product pages for 30 minutes
- Ramp-up test: Gradual increase from 10 to 200 concurrent users over 15 minutes
- Spike test: Sudden jump from 25 to 150 concurrent users to simulate viral traffic
- Endurance test: 100 concurrent users sustained for 2 hours
For each test, we measure TTFB degradation, error rates, and whether the host throttles or suspends the account. Shared hosting plans are expected to degrade under heavy load — we're measuring how gracefully they handle it and what the breaking point is.
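Here is a trimmed-down sketch of the ramp-up test as a k6 script (the target URL is a placeholder; our real scripts add product-page journeys and pass/fail thresholds):

```typescript
// Simplified k6 ramp-up sketch: climb to 200 virtual users over 15 minutes.
// k6 scripts are JavaScript/TypeScript modules run with `k6 run` or in k6 Cloud.
import http from "k6/http";
import { check, sleep } from "k6";

export const options = {
  stages: [
    { duration: "15m", target: 200 }, // gradual ramp up to 200 VUs
    { duration: "5m", target: 0 },    // ramp back down
  ],
};

export default function () {
  const res = http.get("https://test-site.example.com/"); // placeholder URL
  check(res, {
    "status is 200": (r) => r.status === 200,
    "TTFB under 1s": (r) => r.timings.waiting < 1000, // timings.waiting ≈ TTFB
  });
  sleep(1); // think time between requests
}
```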
Phase 4: Support Testing (Days 76-85)
We contact support through every available channel (live chat, ticket, phone, email) with 5 standardized questions ranging from simple ("How do I set up an email forwarder?") to technical ("Can you check if OPcache is enabled and configured for my account?"). We measure:
- Response time for each channel
- Technical accuracy of answers
- Whether the agent resolved the issue or escalated
- Follow-up quality and proactiveness
Phase 5: Final Benchmarks & Scoring (Days 86-90)
We re-run all Phase 1 benchmarks to capture any performance drift over 90 days. A host that performs well in week one but degrades by month three is penalized. Final scores are calculated using our weighted criteria, and the review is drafted with all data included.
Tools We Use
Performance Testing
GTmetrix Pro — Our primary performance measurement tool. We run daily scheduled tests from 7 global locations (Dallas, London, Sydney, Tokyo, São Paulo, Mumbai, Vancouver) using Chrome on a simulated cable connection. GTmetrix provides Core Web Vitals (LCP and CLS, plus TBT as the lab proxy for interactivity), TTFB, fully loaded time, page size, and request count. The historical graph over 90 days reveals performance trends that single-point tests miss entirely.
Google PageSpeed Insights — Used as a secondary validation tool. PSI uses real Chrome User Experience Report (CrUX) data when available, providing field data alongside lab data. We record both mobile and desktop scores at the start, middle, and end of each 90-day test.
WebPageTest — Used for advanced diagnostics when we detect performance anomalies. WebPageTest's waterfall charts, filmstrip view, and connection-level diagnostics help identify whether issues originate from the host's server, DNS, or CDN configuration.
Uptime Monitoring
UptimeRobot (Pro) — Monitors every test site with 1-minute HTTP(S) checks from 3 global locations simultaneously. An outage is confirmed only when all 3 locations report failure, eliminating false positives from network routing issues. We log every incident with timestamps, duration, and response codes.
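The cross-location confirmation rule is simple enough to show; this sketch expresses our logic for counting incidents, not UptimeRobot's internal API:

```typescript
// Sketch of the confirmation rule: count an outage only when every
// monitoring location reports a failed check for the same interval.
type LocationStatus = { location: string; up: boolean };

function isConfirmedOutage(checks: LocationStatus[]): boolean {
  return checks.length > 0 && checks.every((c) => !c.up);
}

// One location failing (e.g. a routing blip) is not an outage...
console.log(isConfirmedOutage([
  { location: "us-east", up: false },
  { location: "eu-west", up: true },
  { location: "ap-southeast", up: true },
])); // false

// ...all three locations failing is.
console.log(isConfirmedOutage([
  { location: "us-east", up: false },
  { location: "eu-west", up: false },
  { location: "ap-southeast", up: false },
])); // true
```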
HetrixTools — Secondary uptime monitor running parallel to UptimeRobot with 1-minute checks from 10+ locations. The dual-monitoring approach ensures we catch brief outages and can cross-reference data between services. HetrixTools also provides blacklist monitoring and SSL certificate expiry alerts.
Load Testing
k6 Cloud (formerly LoadImpact) — Our primary load testing tool for simulating concurrent users. k6's scripting language lets us create realistic user journeys (browse homepage → view product → add to cart → checkout) rather than just hammering a single URL. Cloud execution generates load from multiple geographic regions simultaneously.
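For illustration, here is a trimmed journey sketch in that style (the base URL, paths, and form fields are placeholders standing in for whatever the test store exposes):

```typescript
// Trimmed k6 user-journey sketch: browse -> view product -> add to cart.
// The base URL, paths, and form fields below are placeholders, not real endpoints.
import http from "k6/http";
import { group, sleep } from "k6";

const BASE = "https://test-site.example.com"; // placeholder

export default function () {
  group("browse homepage", () => {
    http.get(`${BASE}/`);
    sleep(2); // reading time
  });
  group("view product", () => {
    http.get(`${BASE}/product/sample-product/`); // placeholder product slug
    sleep(3);
  });
  group("add to cart", () => {
    // WooCommerce-style AJAX add-to-cart; k6 sends the object as a form-encoded body
    http.post(`${BASE}/?wc-ajax=add_to_cart`, { product_id: "42", quantity: "1" });
    sleep(1);
  });
}
```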
Apache Bench (ab) — Used for quick, repeatable request-per-second measurements. We run standardized ab tests (1,000 requests, 10 concurrent) against the homepage and a dynamic product page to establish baseline throughput numbers that are easily comparable across hosts.
Security Assessment
SSL Labs — Tests SSL/TLS configuration quality, grading cipher suite selection, protocol support, and certificate chain validity. All hosts receive an SSL Labs grade in their review.
SecurityHeaders.com — Checks HTTP security headers (HSTS, CSP, X-Frame-Options, etc.) that hosts configure by default. Hosts with better default security headers score higher in our security assessment.
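You can reproduce the basic header check yourself; here is a minimal sketch using Node 18+'s built-in fetch (the URL is a placeholder):

```typescript
// Quick sketch: list the security headers a host sends by default.
// Requires Node 18+ for the global fetch API. The URL is a placeholder.
const HEADERS_TO_CHECK = [
  "strict-transport-security",
  "content-security-policy",
  "x-frame-options",
  "x-content-type-options",
  "referrer-policy",
];

async function auditHeaders(url: string): Promise<void> {
  const res = await fetch(url, { method: "HEAD" });
  for (const name of HEADERS_TO_CHECK) {
    const value = res.headers.get(name);
    console.log(`${name}: ${value ?? "MISSING"}`);
  }
}

auditHeaders("https://test-site.example.com/").catch(console.error); // placeholder URL
```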
Sucuri SiteCheck — Scans for known malware, blocklist status, and website errors. Run at the start and end of each 90-day test to verify the host hasn't introduced vulnerabilities.
Scoring Criteria Breakdown
Our Weighted Scoring System
Every hosting provider receives a score from 1.0 to 10.0 based on six weighted categories. The weights reflect what matters most for real-world hosting quality; the sketch after the table shows how they combine into a final score:
| Category | Weight | What We Measure |
|---|---|---|
| Performance | 30% | TTFB, LCP, page load time, Core Web Vitals across 7 locations |
| Uptime | 25% | 90-day uptime percentage, incident count, maintenance transparency |
| Support | 15% | Response time, accuracy, resolution rate, channel availability |
| Features | 15% | Backups, SSL, staging, CDN, email, security tools included |
| Value | 10% | Price-to-performance ratio, renewal pricing, hidden costs |
| Ease of Use | 5% | Onboarding, dashboard quality, documentation, migration |
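In code, the final score is just a weighted average of the six category scores. A minimal sketch of the calculation, assuming scores on the 1.0-10.0 scale and rounding to one decimal place (the rounding rule here is our simplification):

```typescript
// Sketch of the weighted scoring calculation using the table above.
// Each category score is on a 1.0-10.0 scale; the weights sum to 1.0.
const WEIGHTS = {
  performance: 0.30,
  uptime: 0.25,
  support: 0.15,
  features: 0.15,
  value: 0.10,
  easeOfUse: 0.05,
} as const;

type CategoryScores = Record<keyof typeof WEIGHTS, number>;

function finalScore(scores: CategoryScores): number {
  const total = (Object.keys(WEIGHTS) as (keyof typeof WEIGHTS)[])
    .reduce((sum, cat) => sum + scores[cat] * WEIGHTS[cat], 0);
  return Math.round(total * 10) / 10; // round to one decimal place
}

// Example: a host that is fast and stable but light on features.
console.log(finalScore({
  performance: 9.0, uptime: 9.5, support: 7.0,
  features: 6.5, value: 8.0, easeOfUse: 8.0,
})); // 8.3
```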
Performance Scoring (30%)
Performance is the heaviest weight because it directly impacts user experience, SEO rankings, and conversion rates. We measure TTFB (Time to First Byte) as the primary metric because it reflects server-side performance independent of page optimization. A host scoring 10/10 delivers sub-200ms TTFB consistently across all 7 test locations with minimal variance. The sketch after the band list shows how an average TTFB maps onto these bands.
- 9-10: TTFB under 200ms average, LCP under 1.5s, zero degradation under moderate load
- 7-8: TTFB 200-400ms, LCP under 2.5s, minor degradation under load
- 5-6: TTFB 400-700ms, LCP under 3.5s, noticeable performance swings
- 3-4: TTFB 700ms-1s, LCP over 3.5s, significant load sensitivity
- 1-2: TTFB over 1s, frequent timeouts, unusable under any meaningful traffic
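A minimal sketch of that mapping (the real performance score also weighs LCP, variance across locations, and load-test behavior, so average TTFB only picks the band):

```typescript
// Sketch: map a 90-day average TTFB (in ms) onto the score bands listed above.
// The full performance score also factors in LCP, variance, and load tests.
function ttfbBand(avgTtfbMs: number): string {
  if (avgTtfbMs < 200) return "9-10";
  if (avgTtfbMs < 400) return "7-8";
  if (avgTtfbMs < 700) return "5-6";
  if (avgTtfbMs < 1000) return "3-4";
  return "1-2";
}

console.log(ttfbBand(180)); // "9-10"
console.log(ttfbBand(550)); // "5-6"
```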
Uptime Scoring (25%)
Uptime is the second heaviest weight. A fast host that goes down frequently is worse than a moderately fast host that stays online. We measure actual uptime over 90 days, not the host's advertised guarantee; the downtime figures below follow directly from that 90-day window (see the conversion sketch after the list).
- 9-10: 99.98%+ uptime (under 26 minutes downtime in 90 days)
- 7-8: 99.95-99.97% (39-65 minutes downtime)
- 5-6: 99.90-99.94% (78-130 minutes downtime)
- 3-4: 99.50-99.89% (roughly 2.4-10.8 hours downtime)
- 1-2: Below 99.50% (over 10.8 hours downtime)
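A minimal sketch of the conversion, assuming the 90-day window of 129,600 minutes:

```typescript
// Convert an uptime percentage into minutes of downtime over a 90-day window.
const MINUTES_IN_90_DAYS = 90 * 24 * 60; // 129,600 minutes

function downtimeMinutes(uptimePercent: number): number {
  return ((100 - uptimePercent) / 100) * MINUTES_IN_90_DAYS;
}

console.log(downtimeMinutes(99.98)); // ~25.9 minutes
console.log(downtimeMinutes(99.95)); // ~64.8 minutes
console.log(downtimeMinutes(99.90)); // ~129.6 minutes
console.log(downtimeMinutes(99.50)); // 648 minutes (~10.8 hours)
```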
Support Scoring (15%)
Support quality is tested with real interactions, not hypothetical evaluations. We submit identical questions across all hosts and grade based on response time, technical accuracy, and whether the issue was actually resolved vs. answered with a generic template.
Value Scoring (10%)
Value accounts for the full cost picture: introductory pricing, renewal pricing, included features, and any upsells required to get advertised functionality. A host charging $2.99/mo that renews at $12.99/mo with a required $3/mo backup add-on scores lower than a $4.99/mo host that includes backups and renews at $7.99/mo. We calculate an "effective monthly cost" that factors in the typical 36-month customer lifecycle.
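A sketch of that effective-monthly-cost calculation, assuming a 12-month introductory term (intro term lengths vary by host, so treat that as an illustrative assumption):

```typescript
// Sketch of the "effective monthly cost" over a 36-month customer lifecycle.
// Assumes a 12-month introductory term; real intro terms vary by host.
interface PlanCost {
  introMonthly: number;   // promotional price per month
  introMonths: number;    // length of the promotional term
  renewalMonthly: number; // price per month after the promo ends
  requiredAddons: number; // monthly cost of add-ons needed for advertised features
}

function effectiveMonthlyCost(plan: PlanCost, lifecycleMonths = 36): number {
  const renewalMonths = lifecycleMonths - plan.introMonths;
  const total =
    plan.introMonths * plan.introMonthly +
    renewalMonths * plan.renewalMonthly +
    lifecycleMonths * plan.requiredAddons;
  return total / lifecycleMonths;
}

// The two hosts from the example above:
const hostA = { introMonthly: 2.99, introMonths: 12, renewalMonthly: 12.99, requiredAddons: 3.0 };
const hostB = { introMonthly: 4.99, introMonths: 12, renewalMonthly: 7.99, requiredAddons: 0 };

console.log(effectiveMonthlyCost(hostA).toFixed(2)); // "12.66"
console.log(effectiveMonthlyCost(hostB).toFixed(2)); // "6.99"
```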
Transparency & Conflict of Interest
Affiliate Disclosure
We earn affiliate commissions from some hosting providers reviewed on this site. This is how we fund the testing infrastructure, tool subscriptions (GTmetrix Pro, UptimeRobot Pro, k6 Cloud), and the 45+ active hosting accounts we maintain. Here's exactly how it works:
- We test first, add affiliate links second. Scores and rankings are determined before any commercial consideration. We've given low scores to hosts with high commission rates and high scores to hosts with no affiliate program.
- Commission rates do not influence rankings. If Host A pays $150/referral and Host B pays $50/referral, but Host B outperforms in testing, Host B ranks higher. Period.
- We disclose all affiliate relationships. Every review page identifies which links are affiliate links using clear labeling.
How We Handle Conflicts
Hosting companies occasionally offer increased commissions, free premium accounts, or sponsorship deals in exchange for favorable coverage. Our policy is unambiguous:
- We decline all pay-for-placement offers
- We purchase every plan at retail price using personal credit cards
- We do not accept free hosting accounts for review purposes
- We do not share reviews with hosts before publication
- We update reviews when hosts change their offerings, for better or worse
Data Retention & Verification
All monitoring data, test results, and support interaction logs are retained for at least 12 months. Readers who question a specific data point can contact us, and we'll provide the underlying evidence. GTmetrix reports, UptimeRobot logs, and k6 Cloud test results are archived with timestamps that verify they correspond to the review period.
Review Update Policy
Hosting products change frequently — pricing adjusts, features are added or removed, infrastructure upgrades are deployed. We re-test every reviewed host on a 6-month cycle. When significant changes occur between cycles (major price increases, infrastructure migrations, ownership changes), we update the affected reviews within 30 days and note the update date prominently.
FAQ
Frequently Asked Questions
Do you actually buy every hosting plan you review?
Yes. Every hosting plan is purchased at retail price using our own credit cards. We never accept complimentary or press accounts because those often receive preferential treatment — priority support queues, placement on less crowded servers, and faster resource allocation. Our experience must mirror what a real customer receives.
Why 90 days instead of a shorter testing period?
Shared hosting performance degrades over time as the server fills with accounts. A host can deliver excellent TTFB in week one and slow down 40% by month three due to resource contention. Our 90-day window captures this degradation pattern, seasonal traffic variations, and enough uptime data to be statistically meaningful. Shorter tests produce unreliable results.
How do affiliate commissions affect your ratings?
They don't. Scores and rankings are determined by test data before commercial considerations are applied. We've ranked hosts with no affiliate program above hosts offering $150+ per referral because the data supported it. Our editorial policy strictly separates testing from monetization.
What happens if a host's performance changes after your review?
We re-test every reviewed host on a 6-month cycle. Between cycles, significant changes (pricing updates, infrastructure migrations, ownership changes) trigger an expedited re-test within 30 days. Updated reviews display the original and updated dates so readers know when data was last verified.
Can hosting companies pay to improve their score?
No. We decline all pay-for-placement offers, sponsored review requests, and commission-based ranking adjustments. The only way a host improves its score is by improving its actual product — better server performance, higher uptime, faster support, or more competitive pricing in our next testing cycle.
How do you test support quality objectively?
We submit 5 identical questions to every host through all available channels (live chat, ticket, phone, email), ranging from simple account questions to technical server configuration inquiries. We measure response time, technical accuracy, resolution rate, and follow-up quality. The same questions ensure fair comparison across all providers.
The Bottom Line
Transparent methodology separates trustworthy reviews from marketing content. Every score on this site is backed by 90 days of continuous monitoring, standardized benchmarks from multiple tools, and real support interactions. We publish this methodology so you can evaluate our process and hold us accountable. If you have questions about any specific data point, contact us — we retain all testing evidence for at least 12 months.
More guides: Best Cheap Hosting 2026 • Best Uptime Guarantee Hosting • Hosting Industry Trends 2026