|
FULL EDITED POST
Do you struggle with your aim? Well, it may not be your fault.
There is a factual aspect to this.
For those who have doubts about these commands: m_pitch controls the vertical sensitivity speed in the game, while m_yaw controls the horizontal. Combined, they determine the overall sensitivity factor.
These commands became protected to prevent abuse: changing the values could crash servers, and in CS:S, for example, binding a value change to mouse1 while shooting reduced recoil. Those are the reasons the command is locked.
Taking this into account, here are the current m_pitch and m_yaw settings:
"m_yaw" = "0.022"
"m_pitch" = "0.022000" ( def. "0.022" )
People have complained that 0.022000 is different from 0.022, namely that 0.022000 is slightly faster.
The default value is 0.022, but the command is locked to 0.022000.
So let's assume there is some configuration error and check the numbers.
OK, let's convert them:
Decimal representation: 0.022
Binary representation: 00111100101101000011100101011000
Hexadecimal representation: 0x3cb43958
After casting to double precision: 0.02199999988079071

Decimal representation: 0.022000
Binary representation: 00111100101101000011100101011000
Hexadecimal representation: 0x3cb43958
After casting to double precision: 0.02199999988079071
They have identical 32-bit representations, so the difference is not in how they would be written in source code.
So, where can you find these commands? In a configuration file (a text file the game loads), config.cfg, which goes through a different code path.
That means "0.022000" and "0.022" are text that Valve's code has to read and interpret. It is perfectly possible that Valve's code has a bug when interpreting the text "0.022000" into the actual number 0.022, so the problem might in fact be in the text-to-float conversion function.
Is this really an issue? Yes. The difference is small, but it relates to sensitivity, which is crucial, and it directly affects your in-game aiming accuracy.
A brief explanation:
https://i.imgsafe.org/8d2bd74174.png
Console: https://i.imgsafe.org/8d29bc53f9.pn
From 2000 to 2012, CS 1.6
m_pitch 0.022
m_yaw 0.022
From 2004-2012, CS:Source
m_pitch 0.022
m_yaw 0.022
CS:GO
m_pitch 0.022000
m_yaw 0.022
If someone can check this at runtime, please step forward.
|
What are you saying here? I don't understand the purpose of this thread.
Are you saying not to change these two settings in the console?
|
No, I'm saying that sensitivity in CS:GO is not being calculated correctly. That means you may feel comfortable with your sensitivity, but you would have better accuracy if it were corrected.
The current yaw and pitch values are out of sync. Valve needs to correct the current value of m_pitch; it is a protected command, so we cannot change it ourselves, except with sv_cheats 1.
|
France9034 Posts
Source Engine is written in C/C++. Depending on the compiler and the settings used at compile time (fast float compilation), this will determine how precise the values are. If written as 0.022000, it may become 0.0220096 or something, for example. If written as 0.022, it can only be 0.022.
This is entirely and completely false.
Look up the IEEE 754 standard, which defines how floating point works (and which is the representation used by all the C++ compilers out there, though the standard technically leaves it up to them), and then come back and show me how the binary representations of "0.022" and "0.022000" differ.
Hint: they don't.
Too lazy to check this, but I'd bet any sane C++ compiler out there (be it Clang, GCC or even VC++) would just trim the zeros beforehand (when it actually removes whitespace, comments, etc. from a translation unit) in such cases at compile time, even without aggressive optimization flags set.
The two values are out of sync visually, but that's it.
EDIT: Here's working code that shows this, live on Coliru.
|
I think you missed my example, where the code actually outputs the binary representations, which are identical.
And saying "they are not necessarily the same" shows a profound misunderstanding of floats and of IEEE 754, which is deterministic. That doesn't mean that 0.022 has an exact representation, but it will always be rounded to the same one (otherwise we would have a VERY big problem with floats, as if there weren't already enough traps in the most used representation).
Just linking one of the most well-known guides about floating point doesn't magically make your argument right. There's nothing about extra zeros in there.
|
Yes, they are identical.
So you are saying that setting 0.022000 gives exactly 0.022000, and not 0.0220096 or something?
I must explain this: in-game they act differently when 0.022000 is set. If both are set to 0.022, vertical and horizontal sensitivity are precise. If one is set to 0.022 and the other to 0.022000, sensitivity is precise, but not as much as with the values above; 0.022 + 0.022000 is somehow not as accurate as 0.022 + 0.022. Checking the m_yaw and m_pitch values in earlier versions of Counter-Strike, it appears both have been 0.022 the whole time.
It is something related to the engine and the way it handles these settings (compiler + hardware + outputs). I have only done physical testing and research; I have no concrete evidence of how the two differ. Do you know some way to test both values and prove that they work differently?
|
Unless there is a serious bug in their config parser's floating point value parsing, you are going to have the same representations for 0.022 and 0.022000.
The IEEE754 standard absolutely does not treat those constants any differently. Rounding and precision are (in this case) only going to rely on the underlying float type they chose.
|
What I'm actually saying is: whether or not there's a difference, this paragraph:
Source Engine is written in C/C++. Depending on the compiler and the settings used at compile time (fast float compilation), this will determine how precise the values are. If written as 0.022000, it may become 0.0220096 or something, for example. If written as 0.022, it can only be 0.022.
is just dead wrong, whatever the context.
Any actual difference, if it provably exists, must be explained differently, because this is not how it could happen.
It actually makes your title
(This happens because of: Floating-Point Arithmetic Calculations)
completely wrong as well.
|
In-game they act differently when 0.022000 is set. If both are set to 0.022, vertical and horizontal sensitivity are precise; if one is set to 0.022 and the other to 0.022000, sensitivity is precise but not as much as with the values above. 0.022 + 0.022000 is somehow not as accurate as 0.022 + 0.022. Checking the m_yaw and m_pitch values in earlier versions of Counter-Strike, it appears both have been 0.022 the whole time.
It is something related to the engine and the way it handles these settings (compiler + hardware + outputs). I have only done physical testing and research; I have no concrete evidence of how the two differ. Do you know some way to test both values and prove that they work differently?
I think you missed my last post. So, assuming this explanation is incorrect, what are the alternatives? What might make 0.022 + 0.022000 different from 0.022 + 0.022 in terms of accuracy/sensitivity?
Memory? Outputs?
|
How do you know there is a demonstrable difference? Your video doesn't show anything measurable.
It also has a bunch of misleading and/or wrong comments on floating-point values and how they are interpreted. (For example, those two values aren't going to be 16-bit vs 32-bit floats, since the type they get parsed into isn't changing.)
|
On July 22 2016 05:53 weqn wrote: In-game they act differently when 0.022000 is set... Do you know some way to test both values and prove that they work differently? I think you missed my last post. So, assuming this explanation is incorrect, what are the alternatives? Memory? Outputs?
That you haven't thought of another explanation doesn't automatically make this one valid.
Also, I think this has gone on long enough; tapping out.
|
On July 22 2016 06:00 PunitiveDamages wrote: How do you know there is a demonstrable difference?
Because the difference is noticeable in-game. It's not just me who noticed it, but also many other people I shared this with.
|
On July 22 2016 06:42 weqn wrote: Because the difference is noticeable in-game. It's not just me who noticed it, but also many other people I shared this with.
Unfortunately, without a quantitative difference that you can measure, it's going to be impossible to figure out whether there is an actual difference in behavior. And without that, you can't find a source for it.
I'll echo again everything Ragnarork said: there is no difference between 0.022 and 0.022000 as far as floating point is concerned. If there is a real effect, it is not being caused by rounding or precision effects of floating point.
|
On July 22 2016 06:25 Ragnarork wrote: That you haven't thought of another explanation doesn't automatically make this one valid. Also, I think this has gone on long enough; tapping out.
Please don't take this as an offense, but you are not helping... I asked whether it is possible to test both values and prove that the two setups work differently.
|
On July 22 2016 06:47 PunitiveDamages wrote: Unfortunately, without a quantitative difference that you can measure, it's going to be impossible to figure out whether there is an actual difference in behavior... If there is a real effect, it is not being caused by rounding or precision effects of floating point.
OK, there is no difference in the floats; the numbers and values are all precise. Both setups are precise.
But 0.022 feels so much better to control; it feels more precise.
Looking at previous versions of Counter-Strike, all of them used 0.022 + 0.022, not 0.022 + 0.022000 like CS:GO. There has got to be something in the engine or the code, some bug; I don't know what it can be (you just cleared my mind about the floats). I want to prove it. How can I compare the two values? Are there any programs or macros that would help figure this out? I would be wasting my time and other people's time if I weren't 100% sure about what I'm trying to say.
|
On July 22 2016 07:01 weqn wrote: OK, there is no difference in the floats... I want to prove it. How can I compare the two values? Are there any programs or macros that would help figure this out?
The only option I can think of would be to:
- Hook the CS:GO mouse input (obviously this is possible, since various hacks can spoof mouse input)
- Send a specific mouse movement (you can't do this manually or with a physical mouse)
- Record the results with the different settings (0.022 and 0.022000) from the same starting point.
Then examine the recordings to see whether the in-game crosshair movement is any different. If there isn't a difference, it's not a real effect. If there is, you've found a bug in how CS:GO handles mouse movement.
Good luck.
|
How do I send mouse input?
I'm not a programmer; is there some sort of program for this?
|
Can you please help me with that? I have no idea how to do it. I tried macros, but they don't work; I had to move the mouse myself. Please help.
|
Is this a troll? These threads have been showing up a lot lately.
You've yet to show any proof, quantitative proof that is. What you're claiming doesn't make much sense, and as far as I'm aware it's been debunked with regard to how the engine would parse it, so the burden of proof falls entirely on you.
|
No, apparently this thread is fine.
|
OK, even if (and I do not think this is so) you had a difference of 0.0000096 (~10⁻⁵) between vertical and horizontal movement: unless you have a 100K monitor, you will not get a measurable difference. And even then, you would not be able to notice it yourself without the help of software.
So even if you are right, it would still not matter.
|
On July 26 2016 12:57 Nixer wrote: Is this a troll?... so the burden of proof falls entirely on you.
Those values are confirmed in the Source SDK 2013: there, m_pitch is 0.022 and m_yaw is 0.022. We are in 2016; it has been changed since, and there are bugs everywhere being fixed day after day.
This has to do with how the outputs handle the floats. The Source engine is not perfect; as you can see, trailing zeros make a difference. It may be slight, and it is slight, but it is still a difference: an amount that casual players will not notice but professional players will.
A situational example: it is like one pair of football cleats weighing 500 g versus another weighing 100 g; which gives better performance? It is probably a difference casual players would not notice, but professional players would. CS:GO is a top world esport; there needs to be awareness of these aspects so that everything is at its best for the competitive experience.
|
On July 26 2016 19:13 Turi wrote: OK, even if (and I do not think this is so) you had a difference of 0.0000096 (~10⁻⁵) between vertical and horizontal movement: unless you have a 100K monitor, you will not get a measurable difference. And even then, you would not be able to notice it yourself without the help of software.
So even if you are right, it would still not matter.
And do not forget that after the float, the in-game sensitivity value and the Windows settings (if raw input is off) are applied on top, which multiplies this slight difference. So yes, regardless of screen size, you will notice the difference.
|
No, you will not. I know it is hard to get a feeling for numbers this small, but unless the difference is somehow exacerbated by a factor of at least a thousand*, your own biological difference between sideways arm movement and forward/backward arm movement is bigger than the difference between these two movement values.
Also, how would adding static data like window size influence a simple difference? If you add the data, you get the same absolute difference, and if you multiply, you get the same relative difference. And in this case both are so small that you get no measurable difference no matter what you do to the data.
To stay with your strange example: the difference between your two football cleats would not be 400 g. 500 g × 10⁻⁵** is 5 mg. That is easily within spec; two real football cleats will differ by more than that, even if they are the same model from the same manufacturer. And if the difference were a thousand times bigger, you would still not notice a 5 g difference on 500 g cleats (they would no longer be in spec, but that's precision engineering for you, and also no longer relevant to this topic).
And you still have not shown that there is an actual difference, by the way. The way to measure it would be to try to move your mouse*** at a 45° angle. If the resulting line has an angle different from 45°, one direction is measurably faster than the other. This has the nice feature of not needing two additional time measurements, which are less precise than an angle measurement. Of course, if you do it in software, a simple "move the mouse in a line on x and on y for n seconds and record the travel distance" could do the trick if you are very careful.
* I'm spitballing here; I do not have research on what difference in mouse movement can be noticed by a human. But considering that both movements are independent of each other and the hand movement is so different between sideways and forward/backward, I think this is a safe guess.
** I am using the absolute difference, not the relative one. While at first this felt wrong, we are talking about floating-point errors here: the relative difference changes, but the absolute difference will always stay around ~10⁻⁵ in this case.
*** Not by hand; your hand movement is not precise enough for such a measurement.
|
I am Portuguese, and I confess I have to use Google Translate to understand you.
"Also, how would adding static data like window size influence a simple difference?" The screen size does not influence anything; this is purely about sensitivity.
"And you still have not shown that there is an actual difference, by the way." Indeed not. The only thing I can give you is the feedback from my experiments, and from other people who also tested it and confirmed that there is indeed a slight difference. I do not have the knowledge to do this kind of mechanical/programmatic testing. If you have the knowledge to do it, please step forward, or perhaps someone reading this does.
"I do not have research on what difference in mouse movement can be noticed by a human." If the difference were so small that it could hardly be felt, this issue would not be relevant, and I would not waste time trying to expose it. What happens is that the mouse movement speed really does vary between these two settings; it is not something exaggerated, and it is not something tiny, but it is noticeable. If this gets fixed, the horizontal mouse speed will equal the vertical, which I firmly believe will improve every player's accuracy in a certain way. This is ugly programming.
"Your hand movement is not precise enough for such a measurement." If I do not understand programming and coding, how do you explain the fact that I can tell there is something different between these two settings? Believe me: if you are familiar with the game, you will be able to notice the difference in mouse movement speed. I have used the same setup for years, so I can quickly identify differences; and as I told you, I'm not the only one saying there is a difference.
|
There could be two factors at play:
a) The placebo effect (that is not an attack on you, btw). That is a real possibility here: you have convinced yourself that you have different movement speeds in different directions, so now you feel those differences even if there are none. Human brains are stupid in this way and are easily fooled. That is why we need actual measurements, not feelings, to determine the truth. It could also explain why a small group of people is convinced of the same thing but not everyone (as it would be if there were an actual difference): you have all convinced each other. Group dynamics are horrible.
b) Another bug. There could be another bug that only hits a small group of people and results in different movement speeds.
But I do not believe at all that your explanation could be right. The difference would be just that small, and most likely not even there, considering that it does not make sense for the compiler to keep trailing zeros.
|
I might take a bit of a look out of curiosity later on; I probably should know these things. My initial gut feeling is that the inaccuracy here is negligible compared to all the other inaccuracies you experience due to hardware, mouse surface, drivers and the rest of CS:GO's input handling (assuming it actually exists in the first place).
|
I did not take it as offensive. It is not a placebo effect; I immediately shared it with friends and acquaintances, and they can confirm to you that there is indeed a slight difference. I would like to be wrong about this, because it is quite frustrating to know that something is working the wrong way and have no way to prove it scientifically, because I do not have that kind of knowledge.
I think this has to do with the Source engine and not with external settings. Here is my setup anyway: 400 DPI, 1.8 sensitivity, 6/11 Windows pointer speed, Enhance Pointer Precision OFF, 500 Hz polling rate, m_customaccel 0, m_rawinput 0.
I do not know if this happens with m_rawinput 1, but it should, because this is related to in-game factors by default and not to Windows.
|
Please, guys, does anyone know a way to test both settings' performance, so it can be proven whether there is a difference? Please help!
|
On July 28 2016 19:33 Bacillus wrote: I might take a bit of a look out of curiosity later on... My initial gut feeling is that the inaccuracy here is negligible compared to all the other inaccuracies you experience due to hardware, mouse surface, drivers and the rest of CS:GO's input handling.
Do you know any way to test both settings in order to prove that there is a difference?
|
On July 31 2016 22:16 weqn wrote: Do you know any way to test both settings in order to prove that there is a difference?
Find a friend. Explain it to him. Have him set it up one way and not tell you which. Test it out and report your findings. Then have him switch the settings and test again. This isn't perfect, because he knows the difference; to minimize any possible effects, minimize any kind of contact with your friend during the experiment (talking to him, etc.).
And stop triple posting...
|
It is very well known that changing the resolution changes the sensitivity, though I fail to see how this backs up your argument.
Obviously, 1.8 is the same as 1.800000; there's hardly any possible debate about this. That picture merely says that resolution affects sensitivity, which I've personally known since the late 90s/early 2000s from playing Quake etc.
I think you're mixing things up here, tbh.
Btw, you should set rawinput to 1.
|
On August 02 2016 10:43 bduddy wrote: Find a friend. Explain it to him. Have him set it up one way and not tell you which... And stop triple posting...
I did that already, but to Valve that is trash; they think it's a "montage".
I do not know how to prove it scientifically. It is something you feel in the mouse movement with respect to sensitivity. I'm not the only one of this opinion; many others who tested it were also able to identify the difference. But we are players, not programmers, so we do not have the knowledge needed to prove it scientifically, or in some other way that makes the difference in mouse movement speed clear.
People say: 0.022 and 0.022000 are the same number.
I agree, but 0.022000 feels faster than 0.022.
What is the explanation for this? What could be causing this difference between mathematically equal values?
This is what has to be proven. But how?
|
On August 02 2016 21:31 cptmcgroovy wrote: It is very well known that changing the resolution changes the sensitivity, though I fail to see how this backs up your argument... Btw, you should set rawinput to 1.
I think you should read the first paragraph, where I say "(...) for example".
"Btw, you should set rawinput to 1." Thanks, but I've been playing for 10 years without raw input, and CS:GO's raw input is not that good.
|
On August 03 2016 04:25 weqn wrote: I did that already, but to Valve that is trash; they think it's a "montage"... This is what has to be proven. But how?
WTF do you mean by "montage"?
This is probably why you're seeing an effect: https://en.wikipedia.org/wiki/Placebo#Effects
|
Montage = video. This difference is something you can identify even visually, with some ease, if you are familiar with the game. Is that a placebo effect too? Or an eye problem? This is not placebo.
|
Does anyone know how to check this at runtime?
|