I am just posting here to verify that I am properly using a tool that is new to me, a micrometer. Here is what the precision label specifies: basically, it can measure up to one inch and down to 1/1000" (one notch on the round scale).
I would like to describe how I converted the readings on the linear and round scales, and you can tell me if I did it right.
In the above picture, the linear scale reads 2/10" and the round scale reads 15/1000". So the total measurement is 0.2" + 0.015" = 0.215". I am particularly curious to verify whether each notch on the round scale is indeed 1/1000", and whether I am using the correct numerator in each fraction.
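
To double-check my own arithmetic, here is a tiny Python sketch of the conversion as I understand it (the function name and the 0.1"/0.001" step sizes are just my assumptions about how this micrometer is graduated, not anything taken from its manual):

```python
def micrometer_reading(sleeve_tenths: int, thimble_notches: int) -> float:
    """Combine the linear (sleeve) and round (thimble) scale readings, in inches.

    Assumes each major sleeve division is 0.1" and each thimble notch is 0.001".
    Works in whole thousandths internally to avoid floating-point noise.
    """
    total_thousandths = sleeve_tenths * 100 + thimble_notches
    return total_thousandths / 1000

# My example: 2 tenths on the linear scale, 15 notches on the round scale.
print(micrometer_reading(2, 15))  # 0.215
```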

