Nope. Torque (at the crankshaft) is irrelevant for acceleration. You could put a lawnmower engine through a 2000:1 reduction gear and get enough torque to move a fully loaded dump truck. The reason we don't is that the reducer multiplies torque at exactly the same rate it divides RPM. So in getting massive torque out of that lawnmower engine, you're left with practically no RPM at the wheels, and your top speed is under 1 MPH.
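To put rough numbers on that thought experiment (the engine torque, RPM, and tire size below are made-up illustrative values, not real specs):

```python
import math

# A lawnmower engine through a huge reduction gear: plenty of torque,
# almost no wheel speed. All inputs here are assumed for illustration.
engine_torque_ftlb = 10.0   # small single-cylinder mower engine (assumed)
engine_rpm = 3600.0         # typical governed mower speed (assumed)
ratio = 2000.0              # the 2000:1 reducer from the text

wheel_torque = engine_torque_ftlb * ratio  # 20,000 ft-lb: moves the truck
wheel_rpm = engine_rpm / ratio             # 1.8 RPM: barely turning

tire_diameter_ft = 40.0 / 12.0             # 40-inch truck tire (assumed)
circumference_ft = math.pi * tire_diameter_ft
mph = wheel_rpm * circumference_ft * 60.0 / 5280.0
print(f"{wheel_torque:.0f} ft-lb at the wheel, but only {mph:.2f} MPH")
```

Massive wheel torque, a top speed around a fifth of a mile per hour.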
Power is the measure of how much torque you can make at a given RPM, or at how high an RPM you can sustain a given torque. Power is always going to be (roughly) the same at both the input and the output of your gearing; there are friction losses involved, but that's it. So if you want to accelerate your dump truck from 20-30 MPH, there's a certain RPM required at the output of your gearing (based on tire diameter), and a certain torque required at the output of your gearing (based on mass, friction, and tire diameter). Multiply those two numbers (and divide by a unit constant, 5252 when working in ft-lbs and horsepower) and you get the amount of power required to accelerate your dump truck. It works just as well with a high-revving engine and short gearing as it does with a low-revving engine and tall gearing. In fact, the turbine engine in an M1 Abrams tank only makes about 200 ft-lbs at the turbine shaft. The quoted 1500 ft-lbs figure is at the output of an integral 7.5:1 gear reduction.
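The Abrams example makes the point numerically. The 200 ft-lbs and 7.5:1 ratio are from the text above; the turbine RPM here is an assumed illustrative number, not the real spec:

```python
def hp(torque_ftlb, rpm):
    # Horsepower from torque (ft-lb) and RPM; 5252 = 33000 / (2 * pi)
    return torque_ftlb * rpm / 5252.0

ratio = 7.5                    # integral gear reduction (from the text)
turbine_torque = 200.0         # ft-lb at the turbine shaft (from the text)
turbine_rpm = 22500.0          # assumed turbine speed, for illustration only

out_torque = turbine_torque * ratio  # 1500 ft-lb at the reduction output
out_rpm = turbine_rpm / ratio        # proportionally lower RPM

# Power in equals power out (ignoring friction): gearing trades one for
# the other but can't create power.
assert abs(hp(turbine_torque, turbine_rpm) - hp(out_torque, out_rpm)) < 1e-9
```

Whatever RPM you plug in, the gearing multiplies torque and divides RPM by the same factor, so the horsepower on each side of the gearbox is identical.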
The reason heavy machinery uses low-revving diesel engines isn't that it needs crank torque to do work; it's that high-revving engines come with high friction losses and high wear rates. An F1 engine goes about 15 hours between rebuilds, while my Bobcat goes about 100 hours between oil changes.