uses the user-supplied time vector t for simulation. Express t in the system time units, specified in the TimeUnit property of sys. For discrete-time models, t should be of the form Ti:Ts:Tf, where Ts is the sample time. For continuous-time models, t should be of the form Ti:dt:Tf, where dt becomes the sample time of a discrete approximation to the continuous system (see Algorithms). The step command always applies the step input at t=0, regardless of Ti.
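As a brief illustration of these rules, the sketch below uses an arbitrary example system (not one taken from this page): a continuous-time model simulated over a vector of the form Ti:dt:Tf, and a discretized version of the same model simulated over Ti:Ts:Tf, where the spacing of t matches the model's sample time.

```matlab
% Continuous-time model: dt = 0.01 becomes the sample time of the
% discrete approximation used internally for simulation.
sys_c = tf(4, [1 2 4]);      % example second-order continuous system
t_c = 0:0.01:5;              % Ti:dt:Tf, expressed in the TimeUnit of sys_c (seconds)
step(sys_c, t_c)

% Discrete-time model: the spacing of t must equal the sample time Ts.
Ts = 0.1;
sys_d = c2d(sys_c, Ts);      % discretize with sample time Ts = 0.1 s
t_d = 0:Ts:5;                % Ti:Ts:Tf with the same Ts as sys_d
figure
step(sys_d, t_d)
```

In both cases the step is applied at t = 0 even if the vector starts at a nonzero Ti.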