« on: 04:53 AM - 03/08/18 »
I've seen a few posts about folks on consoles changing from the default 125 Hz to the 1000 Hz polling rate and wondered why, given that consoles are limited in the rate they are able to process.
Having just received the Apex and used it at the default 125 Hz setting, it appears to be working okay. Likewise, changing it to 1000 Hz and altering the sensitivities accordingly also appears to work okay, with no noticeable difference in performance. Neither setting really lets me move the mouse cursor in small circles; they only let me move it in small squares, and moving it on the diagonal appears to be somewhat fruitless as all I get are steps!
I have been using a Venom X4, and in fairness I would say the cursor movement with that is more precise, in that I am able to do small circles and it does move on the diagonal, although nowhere near as well as a mouse on a PC.
So, the question is: if changing the polling rate to 1000 Hz requires that the sensitivity be reduced, isn't it just as good to stick with the default polling rate? In essence, all that extra sensitivity adjustment does is knock it back to the same kind of level afforded by the 125 Hz polling rate. I appreciate that the mouse reports every 1 ms at 1000 Hz, but if the console isn't able to keep up with that, and if you are stripping away some of the information in any case by altering the sensitivity, then surely it would be best to keep the polling rate at what the console can keep up with, rather than having it ignore data.
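To put rough numbers on what I mean, here's a little sketch of the arithmetic as I understand it. The DPI, hand speed and console read rate are made-up example figures, not anything from an Apex or console spec, and I'm assuming the translator simply sums whatever reports arrive between console reads:

```python
# Rough sketch of the polling rate vs. sensitivity arithmetic.
# All numbers below are illustrative assumptions, not measured values.

MOUSE_DPI = 16000          # counts per inch (assumed)
HAND_SPEED_IPS = 2.0       # hand moving 2 inches per second (assumed)
CONSOLE_RATE_HZ = 125      # rate the console actually reads input (assumed)

def counts_per_report(poll_hz):
    """Counts the mouse packs into each report at a given polling rate."""
    counts_per_second = MOUSE_DPI * HAND_SPEED_IPS
    return counts_per_second / poll_hz

for poll_hz in (125, 1000):
    per_report = counts_per_report(poll_hz)
    # Assumption: reports arriving between console reads get summed,
    # so the console sees the same total motion per read either way.
    reports_per_console_read = poll_hz / CONSOLE_RATE_HZ
    per_console_read = per_report * reports_per_console_read
    print(f"{poll_hz:>4} Hz: {per_report:6.0f} counts/report, "
          f"{per_console_read:6.0f} counts per console read")
```

If that summing assumption holds, the same physical movement delivers the same total counts per console read at either rate, just split into more, smaller reports at 1000 Hz. And if the sensitivity multiplier acts on those smaller per-report counts, then knocking it back at 1000 Hz would seem to land you in the same place as 125 Hz, which is really the crux of my question.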