You’ve read our laptop reviews. You’ve read our conclusions. And now you’re wondering how we came to them.
Good question. Reviews often lack context, which is evident in the wildly different scores some laptops receive from different publications. Conflicting opinions can actually make choosing a laptop more difficult if the review’s criteria aren’t made clear.
Allow us to lift the veil. Here we’ll explain the benchmarks we use for objective testing and the perspective from which we approach subjective topics. We don’t expect everyone to agree with our opinions about what the best laptops are, but we hope that sharing our process will leave you better equipped to decide what laptop best fits your needs.
The hands-on experience
You can tell whether you like the aesthetic of a laptop from photos alone, which is why we always provide them. You might appreciate the tone of the color, the spacing of elements on the keyboard deck, or even just the visual design of the lid.
But laptops are tactile products. They’re meant to be carried from place to place, and how they feel in your hands or in your backpack matters quite a bit. These senses of sight and touch allow us to make first judgments about the quality of a laptop’s design and build.
We strive to describe both the materials used in a laptop’s construction and how those materials hold up in real-world scenarios. Thickness, weight, and other dimensions apply here too. There’s always a tradeoff, and how manufacturers balance these variables is key.
During our time with a laptop – usually one or two weeks – our initial impressions are tempered by the passage of time. A finish that was at first beautiful and unique may become annoying if it attracts dirt and fingerprints too easily, and a design that seemed mundane may grow on us through its utility.
Ultimately, hands-on impressions are subjective, no matter how much time we spend with each laptop. However, our experience handling many laptops gives us a unique perspective on these products, making it possible to develop informed opinions about where each product we review stands against the competition. At the very least, we want our readers to leave a review with a strong idea of how a laptop looks and feels in the real world.
Interface interaction
The quality of the keyboard and touchpad is always important, and we devote an entire section to these vital user-interface tools.
We look for keyboards that offer solid key feel. To be more specific, we look for keys with a crisp action that rebound quickly when a finger is lifted. Keys should not wobble or skew when pressed off-center, and there should be no flex along the width or length of the keyboard when a key is fully depressed.
Touchpads should have a smooth, precise surface that doesn’t cause your finger to skip or drag. Good palm rejection means you shouldn’t see sudden cursor jumps while typing, either. Whether a touchpad uses a physical button or haptic feedback, we look for touchpads with a quiet, consistent click.
Most of our reviews barely mention touchscreen quality because most implementations provide nearly identical feel. Instead, we spend time talking about related features like a convertible laptop’s hinge or a touchscreen all-in-one’s software.
Display and audio impressions
Though the design of a laptop is in the eye of the beholder, the display and audio systems on these products straddle the line between what is subjectively pleasant and what can be objectively measured.
We attempt to incorporate a bit of both into our judgment of these components. Using the laptop naturally reveals the quality of the display, but we also run tests that provide measurable results. We use the Spyder5Elite color calibration tool and its built-in quality measurement suite to test the display’s brightness, contrast, color gamut, color accuracy, and gamma curve. If it’s an HDR-capable screen, such as an OLED or mini-LED panel, we make sure to try out HDR content too.
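To make those numbers concrete, here’s a minimal sketch of how raw panel measurements turn into the figures we cite in reviews. The luminance and gamma readings below are hypothetical placeholders, not output from the Spyder5Elite software.

```python
# Illustrative only: how raw panel measurements become the figures cited in reviews.
# The readings below are hypothetical placeholders, not Spyder5Elite output.

white_luminance_nits = 410.0   # full-white patch at maximum brightness
black_luminance_nits = 0.38    # full-black patch at maximum brightness
measured_gamma_samples = [2.18, 2.21, 2.24, 2.19]  # gamma measured at several gray levels

contrast_ratio = white_luminance_nits / black_luminance_nits
average_gamma = sum(measured_gamma_samples) / len(measured_gamma_samples)

print(f"Peak brightness: {white_luminance_nits:.0f} nits")
print(f"Contrast ratio: {contrast_ratio:.0f}:1")
print(f"Average gamma: {average_gamma:.2f} (target is 2.2)")
```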
Audio quality is judged through a number of subjective tests. A typical listening session includes YouTube HD video, podcasts, and streaming music. During our tests, we adjust the volume to see how (or if) quality degrades as the speakers get louder.
The test chamber
Most of our judgments take place during real-world use. For example, we usually use the laptop being reviewed to actually write the review, meaning the reviews you read on our site are written on the laptop pictured in the review’s photos. When it comes to performance benchmarks, however, each laptop has to spend some time alone, cranking through an array of tests.
Our processor suite includes:
- Geekbench (single-core and multi-core)
- Cinebench R23 (single-core and multi-core)
- PCMark 10
- Handbrake (encoding a 4-minute, 20-second 4K trailer to H.265; a rough timing sketch follows below)
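For the Handbrake test, the number we record is simply how long the encode takes. The sketch below shows that timing step; the file names are placeholders, and the HandBrakeCLI flags shown are one common way to request an H.265 encode, not our exact preset.

```python
import subprocess
import time

# Placeholder paths; the actual trailer file and output location vary per test bench.
source_clip = "trailer_4k.mp4"
output_clip = "trailer_h265.mp4"

start = time.time()
# Encode the clip to H.265 with HandBrakeCLI (flags are illustrative, not our exact settings).
subprocess.run(
    ["HandBrakeCLI", "-i", source_clip, "-o", output_clip, "-e", "x265"],
    check=True,
)
elapsed = time.time() - start
print(f"Handbrake H.265 encode finished in {elapsed / 60:.1f} minutes")
```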
Our hard drive suite includes:
Our graphics suite includes:
- 3DMark Time Spy
- Fortnite
- Assassin's Creed Valhalla
- Red Dead Redemption 2
- Cyberpunk 2077
We use FRAPS, a well-known benchmarking program, to take accurate frame-rate readings. Interpretation of the results matters as much as the numbers themselves.
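As an example of that interpretation step, the sketch below reduces a per-frame time log (the kind of data a frame-rate tool can export) to the average and 1%-low figures we discuss in reviews. The sample values and column layout are assumptions for illustration, not FRAPS’s exact export format.

```python
# A sketch of turning a per-frame time log (milliseconds per frame) into review numbers.
# The list below stands in for a log file; FRAPS's actual export format is not assumed here.
frame_times_ms = [16.6, 17.1, 16.9, 33.5, 16.7, 16.8, 41.0, 16.6, 17.0, 16.9]

fps_per_frame = [1000.0 / t for t in frame_times_ms]
average_fps = sum(fps_per_frame) / len(fps_per_frame)

# "1% lows" describe the worst moments: the slowest 1% of frames (at least one frame here).
slowest = sorted(fps_per_frame)
one_percent_count = max(1, len(slowest) // 100)
one_percent_low = sum(slowest[:one_percent_count]) / one_percent_count

print(f"Average: {average_fps:.1f} fps, 1% low: {one_percent_low:.1f} fps")
```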
A lasting impression
We use three tests to judge battery life. In all situations, we calibrate the display’s brightness level to 100 lux using a light meter, and also disable any power settings that might dim or turn off the display during testing. We record battery life results using Windows’ built-in battery recording feature.
First, we have our iMacros test. This uses the iMacros extension for Chrome to load several websites in a loop, with a pause between each load to provide downtime and better simulate how real users browse the web.
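The macro itself runs inside Chrome rather than as standalone code, but the structure of the test looks roughly like the sketch below; the site list and pause length are placeholders, not our actual script.

```python
import time
import webbrowser

# Placeholder site list and pause; the real test uses an iMacros macro in Chrome.
sites = ["https://example.com", "https://example.org", "https://example.net"]
pause_seconds = 30  # downtime between loads, to mimic real browsing rather than a stress test

while True:  # the loop runs until the battery dies
    for url in sites:
        webbrowser.open(url)       # load the next page in the default browser
        time.sleep(pause_seconds)  # idle on the page before moving on
```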
Second, we run our video test, which loops a 1080p movie clip in the built-in media player until the battery dies. This tends to be the least demanding test in our suite.
Hot stuff
Heat is always an issue for laptops. Fast processors give off plenty of warmth while operating, but the slim frame of a laptop leaves little room for airflow. The way a notebook deals with the buildup of heat directly impacts usability.
Ideally, a laptop should not warm significantly on either the top or the bottom, but it’s rare that this is the case. We take note of where a product warms as we use it both on a desktop and in our laps and measure hot spots with an infrared thermometer. The results are often referenced in our reviews.
In addition to this real-world testing, we use stress-test programs such as the 7-Zip benchmark and an all-core Cinebench run to simulate the maximum load a laptop might encounter. While doing this, we also note the reported CPU and GPU temperatures to see whether they become hot enough to be a potential source of instability.
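A minimal sketch of that temperature logging is below, assuming a platform where psutil exposes temperature sensors (it does not on every operating system). Sensor names vary by machine, so the example simply reports the hottest reading it can find at each interval while a separate stress test runs.

```python
import time
import psutil

# Poll reported temperatures while a separate stress test (7-Zip, Cinebench) is running.
# psutil.sensors_temperatures() is only available on some platforms, and sensor names
# vary by machine, so this just reports the hottest reading it can find each interval.
for _ in range(60):  # roughly ten minutes at a 10-second interval
    readings = psutil.sensors_temperatures()
    if readings:
        hottest = max(entry.current for entries in readings.values() for entry in entries)
        print(f"Hottest reported sensor: {hottest:.0f} °C")
    time.sleep(10)
```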
We also measure fan noise during our temperature tests. We use a decibel meter in an environment where ambient noise does not exceed 38 decibels. Noise is measured during idle, at full CPU load, and at full GPU load.
Reaching a verdict
The most difficult part of every review is the verdict. This is where we decide if we’re going to recommend a laptop and determine how the outcome of each section fits together to form a final score.
Verdicts are usually handed down from the perspective of what the laptop is built to accomplish. Poor battery life on a gaming laptop won’t significantly impact the score, but an ultraportable with the same problem could lose several points.
Competition must also be considered. Laptops are becoming better with each passing year as each brand tries to outdo its peers. Most of today’s laptops are good, so a machine has to excel to stand out from the pack.
Value is also important. We don’t expect to see a high-resolution display and discrete GPU in a laptop that ships at $500, and we won’t knock it for lacking those features. A laptop that costs $1,500, however, will lose points if it skimps on hardware.
In the end, it’s not about everyone agreeing with our final verdict. We know that’s impossible. We do hope, however, that in reading our laptop reviews, you understand the logic behind how we arrived at those conclusions — and in theory, that’ll help you decide what the best laptop is for you.