My father worked on algorithms for calculating millions of digits of pi. That's computational, but not applied, since it's hard to find applications that require even twenty digits.
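(The computational flavor is easy to reproduce today with off-the-shelf tools. Here is a minimal Python sketch, my own illustration rather than my father's algorithms, using the mpmath library, whose pi routine is built on the same family of fast series used for record digit computations:

```python
from mpmath import mp

mp.dps = 100_000        # decimal places of working precision
pi_digits = str(mp.pi)  # mpmath evaluates pi to the current precision
print(pi_digits[:52])   # first fifty digits: 3.14159265358979...
```

Getting a hundred thousand digits is now a one-liner; the hard part, and the part my father worked on, was the algorithms underneath.)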
I used probability theory to find a strategy for exploiting imperfect information to sample a rare population as cheaply as possible while meeting certain accuracy requirements. This is applied but not computational, since I didn't use a computer to figure out the strategy.
In practice, the line is often blurred: methods developed for "pure" mathematics turn out to have practical applications (one of my father's other hobbies is factoring integers into their prime factors...), and computers are handy for implementing methods developed by non-computational means.