The Qlipp’s Accuracy – Any Improvement?
It’s been a while since our original Qlipp v Sony accuracy hit session. In early September, the Qlipp developers released a new update which, among other things, was supposed to improve the serving algorithm. The Qlipp’s accuracy was called into question during our last test, with only 69 out of 100 shots correct. Sony, on the other hand, got everything right.
A month on, let’s run the numbers again.
For those not familiar with the first test, we hit 100 balls with the Sony Smart Sensor and Qlipp sensor loaded into the same racket for the same hit session. We recorded the entire session on an iPad placed on the center line, close to the net.
We were able to identify every shot that the Qlipp picked up by connecting it to a Samsung phone placed close enough to the iPad that each ball-speed readout was audible on the recording.
Again, we’re only interested in shot and spin identification. In our opinion, the ball speed and sweet spot are secondary, and certainly useless if the shot type and spin type are inaccurate in the first place. There’s a little wiggle room when we’re talking about ball speed and sweet spot accuracy. But the Qlipp’s accuracy (or any sensor’s accuracy for that matter) will live and die by its shot and spin identification.
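To make the scoring criterion concrete: a reading counts as correct only when both the shot type and the spin type match what was actually hit, and a missed shot counts as wrong. Here's a minimal sketch of that tally — the function name and data shapes are our own illustration, not anything from the Qlipp or Sony apps:

```python
# A shot is a (shot_type, spin_type) pair; a missed shot is None.

def score(readings, truth):
    """Count readings where shot type and spin type both match the truth."""
    correct = sum(
        1
        for reading, actual in zip(readings, truth)
        if reading is not None and reading == actual  # missed shot counts as wrong
    )
    return correct, len(truth)

# Toy example: three shots, one misread, one missed entirely
truth = [("forehand", "topspin"), ("backhand", "slice"), ("serve", "slice")]
readings = [("forehand", "topspin"), ("forehand", "slice"), None]

correct, total = score(readings, truth)
print(f"{correct} / {total} correct")  # → 1 / 3
```

Getting the speed slightly wrong on a correctly identified forehand is forgivable; calling that forehand a backhand fails the shot outright, which is why the headline number is all about identification.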
In the original video, we were interested in the readings given by each sensor for each of the 100 shots. This time around we’ve simplified it a little, for a couple of reasons.
First, let’s not bore everyone with another 11-minute video that looks like the original. Second, as shown in the results below, Sony got 100% correct again. It seemed a little silly to just add another 100 shots to the original video.
Instead, we’ve identified a couple of “clumps” of shots that clearly show some problems with Qlipp’s accuracy.
It’s not good news.
The changes made to the serving algorithm in the last update seem to have reduced incorrect serve readings. In the original test, 8 groundstrokes were read as serves. This time, there’s only 1.
But that’s the shining light in an otherwise shocking result.
In the original test, 69 out of 100 shots were read correctly (shot type and spin type). This time around, we’re down to 59 out of 100 correct.
And the mistakes are all over the place. Shots have been missed. Forehands misread as backhands. Slice misread as topspin. There’s no rhyme or reason to most of it.
Even more alarming is the increase in the number of shots deemed to have “flat” spin. The Sony Smart Sensor is the only sensor to report varying degrees of slice and topspin rather than attempting a flat reading, something we’ve written about here.
Rather than go into the detail again, let’s just say that not one of the shots played in this video, or the original, were hit flat. All were hit with varying degrees of topspin or slice.
According to Qlipp, bang on 50 of the shots in this video were flat, up from 30 in the last video. We’ve illustrated why this is such a problem with two particular shots in this video.
We’re still fans of the concept, but we’re rapidly losing faith in a product that has potential yet lacks the accuracy to warrant being on the market. And while Qlipp’s accuracy seems to be worsening, Sony maintains a clean sheet.