Hikers, hunters, golfers, amateur astronomers and others often ask themselves, "How far away is 100 feet (or any other distance)?" Finding an exact answer is rarely straightforward, but a clear line of sight to the object, combined with a few simple techniques, is usually enough to produce a reasonable estimate.
As a rule of thumb, your arm is roughly 10 times as long as the distance between your eyes. With a little math, that ratio lets you estimate the distance to an object whose approximate size you know. Hold out one arm with the elbow straight and the thumb pointing up, close one eye, and line the thumb up with the distant object. Then switch eyes: the thumb will appear to jump sideways. Estimate how far it jumped in terms of the object's known size (for example, two car widths, or about 12 feet). Multiply that jump by 10 for an approximate distance to the object.
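The thumb-jump estimate above boils down to one multiplication. Here is a minimal sketch in Python; the function name and the example car width are my own choices, not part of the original trick.

```python
def estimate_distance_by_thumb(apparent_jump_ft):
    """Estimate distance to an object using the thumb-jump trick.

    apparent_jump_ft: how far the thumb appeared to jump when you
    switched eyes, judged against an object of known size.
    The arm-length-to-eye-spacing ratio is roughly 10:1, so the
    object is about 10 times the jump distance away.
    """
    return apparent_jump_ft * 10


# Example: the thumb jumps about two car widths (~12 ft),
# so the car is roughly 120 ft away.
print(estimate_distance_by_thumb(12))  # 120
```

The 10:1 ratio is only an average across people, so treat the result as a rough figure rather than a measurement.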
In two-dimensional space, you can use the Pythagorean theorem to calculate the distance between two points. Take the coordinates of both points, subtract the x-coordinates from each other and the y-coordinates from each other, square both differences, add them, and take the square root of the sum. The result is in whatever unit the coordinates use, often meters. For a moving object such as a car or train, you can instead work with speed: divide the distance it traveled by the time it took to find its average speed, then use that speed together with an elapsed time to estimate distance.
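The two calculations just described can be sketched in a few lines of Python; the function names and the units in the example are illustrative assumptions, not anything mandated by the method.

```python
import math


def distance_2d(p, q):
    """Euclidean distance between points p = (x1, y1) and q = (x2, y2):
    subtract the coordinates, square the differences, sum, square-root."""
    dx = q[0] - p[0]
    dy = q[1] - p[1]
    return math.sqrt(dx * dx + dy * dy)


def average_speed(distance, elapsed_time):
    """Average speed = distance traveled / time taken
    (e.g. meters per second)."""
    return distance / elapsed_time


print(distance_2d((0, 0), (3, 4)))  # 5.0 (the classic 3-4-5 triangle)
print(average_speed(100.0, 20.0))   # 5.0, e.g. 100 m in 20 s
```

Note that `average_speed` gives a single overall figure; for an object whose speed varies, it only tells you the average over the interval you measured.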