An explanation of the CyclingTips product review process

by Matt Wikstrom


Our product reviews always attract a lot of attention so we thought it was about time to detail the effort that goes into every report. It’s immense, to say the least, and it keeps going until well after the review is published.


The humble product review has been a mainstay of the print media for decades, covering everything from books and films to appliances and cars. And as society has become more consumer-oriented, product reviews have only grown more important, such that most shoppers will defer to them before making a final decision.

For a sport like cycling, where participants are entirely dependent on specialised equipment to pursue the activity, the importance of product reviews is entirely understandable. Long before professional reviews were published in magazines, enthusiasts were sharing “user reviews” of their equipment.

As important as product reviews are, there is a degree of underlying skepticism about the honesty and integrity of at least some of them. After all, there is no peer-review process to test the veracity of any reviewer’s claims. In addition, the line that separates a product review from advertising copy is a blurry one at best, made all the harder to draw by the economic pressures of modern publishing.

Ultimately, the honesty and integrity of any product review lies with the author and publisher. This is something we’ve always taken very seriously, and while our standards are high, they have been shaped by the expectations of our audience.

In the beginning

Any cycling publication that depends upon advertising dollars from the industry is susceptible to a conflict of interest when it comes to reviewing products. This was something that CyclingTips’ founder, Wade Wallace, was acutely aware of during the early years, so he went to the trouble of appointing an independent tech editor.

From that point on, our product reviews were segregated from CyclingTips’ commercial operations, giving our tech team all the freedom they needed to independently assess products. We’ve also maintained a high level of transparency by declaring any perceived conflicts of interest associated with the brands supplying products for our reviews.

It is worth stressing that CyclingTips does not charge manufacturers and/or distributors to review their products. The decision to review any given product rests entirely with our tech team and is based on opportunity, appeal, and relevance.

Most products are delivered in boxes and require some assembly before they can be put to use. This provides a good opportunity for our reviewers to judge the serviceability of each product.

An overview of our review process

The aim of the product reviews published by CyclingTips is to give our readers a good idea of what it would be like to live with the product in question. At the same time, we endeavour to provide information on the company behind the product along with some insight into the effort that went into creating it.

First and foremost then, the key criterion for any review is to spend some time using the product as it was intended. The amount of time varies according to the product and its availability for review. We typically devote 2-3 weeks (20-30 hours of riding) to every bike and wheel review, while other products (e.g. shoes, tools, helmets etc) often require less time to assess.

Since our review period is relatively brief compared to the expected lifespan of most products, we cannot make any judgement on the longevity and/or durability of any product (unless it fails in some way). At face value, this is a profound weakness, but even if we devoted extra time to a long-term review, our report would be anecdotal at best because of the tiny sample size (n=1 for almost all reviews).

We can bypass some of the limitations imposed by this sample size by providing a forum for our readers to share their experiences with the product, which is why we leave every review open to comment. In this way, we can enlarge the sample size under scrutiny while entertaining alternative views on the product.

Every reviewer takes the time to make sure every bike fits them properly, so they always keep a few stems on hand to help with this.

Lab coats and slide rules?

It’s fair to say that product reviews have become more sophisticated in recent years. Where once they were purely descriptive, now these reports can contain a variety of data that can be compared for competing products. Some of these traits, such as the weight of the product, can be reasonably easy to measure, while others (e.g. aerodynamic drag) demand specialised equipment that can be difficult and/or expensive to access.

Thus, economic considerations play an important role in deciding whether we carry out any kind of lab testing; however, we don’t see much point in running a battery of tests when there is only one sample provided for review. After all, the cornerstone of any reliable and meaningful lab test is a determination of sample variation, which cannot be achieved in the absence of multiple samples.

There is another challenge associated with lab tests, namely the relationship that those tests have with the real-world experience. Riding and racing a bike is a multi-faceted experience that has yet to be reduced to a finite number of measurable traits.

Indeed, our understanding of cycling has barely progressed beyond the point where the fundamentals are understood, so the relevance of any measurable trait — be it weight, stiffness or aerodynamic drag — remains open to interpretation (and sometimes fierce debate). That isn’t to say that such information is worthless, but we do not see it being any more reliable or credible than an honest account of the bike’s performance.

An honest subjective account

CyclingTips has long embraced the subjectivity of the review process. We put the product to use just like any buyer and we provide an honest account of our experience. In this regard, our reviews aren’t that far removed from any other user review, but there are a few things that we bring to this process that do a lot to elevate our reports.

First and foremost, there is the experience of our tech team. James Huang, Matt Wikstrom, David Rome and Dave Everett have accumulated almost a century of road cycling between them, and while they aren’t decorated champions, they have experienced the best and worst that the sport has to offer. This experience has also been gained in the context of an ongoing evolution in equipment, which affords them an invaluable perspective on what new products have to offer.

Each member of our tech team is an experienced mechanic, too. This kind of background is crucial for understanding the performance of a product in a practical sense (and probably goes a long way to explaining their fascination with the technical aspects of cycling). It also adds a level to their perspective on a product’s performance that a typical user lacks.

Each member of our tech team now has several years of experience reviewing products for CyclingTips (and in some cases, other publications), and that experience alone counts for a lot. A concentrated effort is required to assess the performance of any product in the first place, and communicating that assessment effectively adds further to the challenge. Anybody who has ever tried to sum up what it was like to ride somebody else’s bike will understand that this is not an easy thing to do.

The time that each member of our tech team has spent assessing products has had a profound effect on the scope of their understanding. As a result, we feel that our tech team is well placed to understand and assess the performance of the products the market has to offer, in both absolute and relative terms. They have also developed the skills to express and sum up their experience in clear, concise language, even though some aspects of a product’s performance can be intangible and fleeting.

There is an argument that a product review should challenge and/or validate a manufacturer’s claims. While we agree with this notion, it is fraught with difficulties, since the original testing conditions may be ill-defined and/or difficult to replicate. Where we have confidence and access to suitable equipment, we will conduct this kind of assessment, but our scope for this kind of testing is relatively limited.

Be that as it may, we have a greater imperative to assess and report on how the product performs under real-world conditions. As such, there is no need to challenge marketing hyperbole per se when an honest and balanced account of our experience will serve as a valuable counterpoint for any manufacturer’s claims.

Selecting candidates for review

Our tech team selects all of the products that are reviewed by CyclingTips on the basis of relevance and appeal to our audience. In some instances, products are brought to the attention of our team by the brands themselves, and in others, we approach the manufacturer/distributor directly.

Importantly, this is done outside of the context of any commercial arrangements. We have found that manufacturers and distributors have a healthy respect for the integrity of our review process and afford us the freedom we need to assess the product.

It is important to note that not all of our requests for products to review are accepted. This is not something we have any control over; however, it can create the impression that we have a bias for reviewing certain brands while ignoring others. This is an unfair assumption for any reader to make and stands in direct contrast to the enthusiasm our tech team has for the range of brands and products that populate the marketplace today.

Our reviewers will go as far as dismantling a product if it will add extra depth to the report.

The testing process

If it isn’t evident from the comments above, our tech team are all self-confessed bike geeks who thrive on the technical aspects of our sport. This fascination had led them to experiment with the performance of a range of products well before they started preparing formal reviews for CyclingTips.

While it is tempting to propose a systematic protocol for testing any and all products, our tech team has been given the freedom to decide how best to assess each product. Interestingly, all have developed a similar approach by taking advantage of familiar roads and regular routes, which goes a long way to standardising their testing regimen; however, this approach isn’t always possible (for example, during bike launches in foreign countries).

James, Matt, and David Rome all like to limit the number of variables by making use of the same saddle for each bike and a set of reference wheels and tyres, while our roving reviewer, Dave Everett, doesn’t have the freedom (or luggage space) to carry around such items.

Our tech team is always out on the road and it’s fair to say that they spend most of their time riding everybody else’s equipment rather than their own. Each product is put to use as if it was their own, and over time, our reviewers build up an impression of the product.

The prevailing weather conditions can sometimes limit the range of challenges that the product can be subjected to, however our reviewers always endeavour to use the product under a range of circumstances to assess the scope of its performance. This includes riding different kinds of terrain and making a variety of efforts.

In preparing their reports, our tech team generally assesses the product in absolute terms — how well does it serve its intended purpose? — and where relevant, against similar products. Wheelsets, for example, may be compared to a standard low-profile alloy wheelset in order to judge stiffness, ease of acceleration, and aerodynamic performance. Similarly, a conventional mid-range non-aerodynamic carbon bike may serve as a point of comparison for a bike review. Importantly, the reviewer will detail any comparison during the course of the report.

Our long review format, which awards scores, is used for all bikes and wheels, and on occasion, products like GPS devices, power meters, trainers, and clothing collections. This format provides considerable detail on the product as well as a lengthy description of its performance. Final scores (out of 10) are awarded for five separate categories, as detailed in the graphic below:

[Graphic: CTech review scoring categories and their weightings]

Products are automatically awarded 6/10 for each category if they are reasonably easy to use and they don’t fail during the review period. Additional points are awarded based on the impressions of the reviewer. A final overall score is also calculated that gives different weights to each category, as shown above.
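To make the weighting concrete, here is a minimal sketch of how such a weighted overall score could be calculated. The category names and weightings used below are illustrative placeholders only, since the actual values are defined in the graphic above rather than in this text.

```python
# A minimal sketch of a weighted overall score built from five category scores.
# The category names and weightings are hypothetical placeholders, not the
# actual CTech values (which are defined in the scoring graphic).

BASELINE = 6  # each category starts at 6/10 if the product is easy to use and doesn't fail

WEIGHTS = {            # hypothetical weightings; they must sum to 1.0
    "performance": 0.4,
    "build quality": 0.2,
    "value": 0.2,
    "serviceability": 0.1,
    "aesthetics": 0.1,
}

def overall_score(category_scores: dict[str, float]) -> float:
    """Combine per-category scores (out of 10) into a single weighted overall score."""
    return round(sum(WEIGHTS[name] * score for name, score in category_scores.items()), 1)

# Example: the reviewer awards points above the 6/10 baseline in a few categories.
scores = {"performance": 8, "build quality": 7, "value": 6, "serviceability": 6, "aesthetics": 7}
print(overall_score(scores))  # -> 7.1
```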

It is important to note that our scoring system is largely subjective and is not comparable with any other scoring system. And while it serves to summarise key aspects of the product’s performance, the scores are best considered an addendum to the reviewer’s report (though they provide a sound point of reference when comparing our reviews for competing products).

Our short review format is used for describing and assessing the performance of various accessories, such as shoes, helmets, sunglasses, tyres, etc. These products are typically subjected to the same range of testing conditions; however, they often don’t require as much time as a long review to arrive at a conclusion.

While it isn’t strictly necessary to provide beautiful photographs of the products we review, we feel it makes the report a whole lot more interesting to read.

Healthy conversation

As mentioned above, we have always welcomed feedback on our reviews. Indeed, we look to our readers to challenge our views with their questions and their criticisms of the review or the product. We don’t see the moment that we publish a review as the end of our job; rather, it marks the start of a conversation with our audience.

Any experience with the product under review will always be worth sharing with our tech team and audience. It’s the most effective way we have to increase our sample size, widen our testing conditions, and diversify our perspective on the performance of the product. The resulting discussion always adds extra depth to our reviews, but we can’t generate this kind of content on our own, so we’ll always be grateful for your help.

We’re also very proud of the fact that representatives from the brands involved often respond to the questions and issues raised by our readers. This is one of the strengths of being a web-based publication and it adds extra depth to our reviews.

Of course, we always enjoy any words of encouragement, and where some constructive criticism might be warranted, we’re also happy to hear from you. All that we ask is that you treat us fairly.

Questions or comments?

If you’ve got any questions or comments about the review process, you can contact us at editor@cyclingtips.com.au or share your thoughts in the comments section below.
