Is there a basic mathematical relationship between IQ and learning speed, such as linear, exponential, etc.?
For example, if some number of people with an approximate IQ of 100 (95-105) take X minutes to complete a task (let's also say plus or minus 5%), is there an expected average time Y that people with an average IQ of 120 would need?
Can you cite the research?
I am very clearly asking for the relationship between the two. I'm really thrown for a loop when people make these kinds of equivocations.
– Randy Zeitman Jan 16 '19 at 03:34