Ratio IQs were calculated by dividing mental age (as measured by the assessment instrument) by chronological age and multiplying by 100. Some of the older IQ tests used ratios or modified ratio scales to calculate this "Intelligence Quotient", the original meaning of IQ. Under this method, a 10-year-old who functioned mentally like a 14.5-year-old would have a 145 IQ.
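As a minimal sketch of that arithmetic (the function name here is illustrative, not from any test manual):

```python
def ratio_iq(mental_age: float, chronological_age: float) -> float:
    """Classic ratio IQ: mental age over chronological age, times 100."""
    return mental_age / chronological_age * 100

# A 10-year-old who performs like a typical 14.5-year-old:
print(ratio_iq(14.5, 10))  # 145.0
```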
Modern IQ tests instead use rarity on the normal distribution curve to calculate scores, with most having a mean of 100 and a standard deviation of 15. With this method, a 145 IQ sits three standard deviations above the mean, which means only about 1 in 740 people would score that highly on the test.
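A short sketch using Python's standard library shows how that rarity falls out of the mean-100, SD-15 normal model (variable names are my own):

```python
from statistics import NormalDist

# Deviation IQ: scores are placed on a normal curve, mean 100, SD 15.
iq_dist = NormalDist(mu=100, sigma=15)

score = 145
percentile = iq_dist.cdf(score)   # fraction of people scoring below 145
rarity = 1 / (1 - percentile)     # "1 in N" at or above this score

print(f"{percentile:.5f}")        # 0.99865 -> the 99.87th percentile
print(f"about 1 in {rarity:.0f}") # about 1 in 741
```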
Scores obtained by one method aren't equivalent to scores obtained by the other. Some published tables give rough conversion ranges, but they should be treated as approximations rather than exact equivalences.