This depends on a number of factors, really:
1) The maximum capture time you need
2) The minimum sampling interval you need
To give you an example, a 3424 has a memory of 512 kS. If you were to use a timebase of 50 ms/div, this would be 500 ms in total. The 3424 can sample at up to 20 MS/s. Doing a quick calculation, at 50 ms/division you would get a sample rate of approximately 1 MS/s (512 kS spread over 500 ms).
In reality you will use about 312 kS of samples, which means your sample interval will be 1.6 µs, due to the way the timebases work: the available sample intervals halve at each step, starting at 5 µs, then 2.5 µs, then 1.25 µs, then 0.625 µs and so on.
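The arithmetic above can be sketched in a few lines of Python. The numbers come from the 3424 example in this answer; the 10-division screen width is an assumption (typical of scope displays) rather than something stated above.

```python
DIVISIONS = 10                 # assumed screen width in divisions
timebase_s = 50e-3             # 50 ms/div
capture_s = DIVISIONS * timebase_s   # total capture window: 0.5 s

memory_samples = 512_000       # 512 kS buffer
# Rate if the whole buffer were spread over the capture window:
ideal_rate = memory_samples / capture_s   # ~1.02 MS/s

actual_samples = 312_500       # samples actually used at this timebase
interval_s = capture_s / actual_samples  # resulting sample interval

print(f"capture window: {capture_s * 1e3:.0f} ms")
print(f"ideal rate:     {ideal_rate / 1e6:.2f} MS/s")
print(f"interval:       {interval_s * 1e6:.2f} us")
```

Running this reproduces the 500 ms window, the roughly 1 MS/s figure, and the 1.6 µs interval quoted above.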
Hope this helps. Let me know if you require clarification on any of the points I raised.