Relation between simulation time and real time?
What are the differences between simulation and real time? Can simulation time run faster than real time? Can simulation time run slower than real time?
Trying to get a grasp of this concept, all help is much appreciated.
Please note: I read this question as somewhat ambiguous, but I will answer as best I can.
Real time is typically the time at which the code is actually running, or at which the action is actually happening. It can also mean that an action occurs only at a set wall-clock time, so I must be present at that moment to observe it.
Simulated time is, as the name suggests, only simulated. For example, I could write a method that performs some action every minute, then pair it with a test method that advances the application's clock by a minute every n real seconds. Because the simulated clock is decoupled from the wall clock, simulation time can run faster than real time, slower than real time, or jump arbitrarily. It can also mean a time I set arbitrarily: I could change my system clock (or my server's) to observe an action that only occurs at a certain time, whenever I want, under nearly real circumstances; or I could change the time value set in the code to see whether a certain behavior occurs.
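To make the faster/slower idea concrete, here is a minimal sketch of a simulated clock in Python. The class name `SimulatedClock` and its methods are my own invention for illustration, not part of any library; a `speed` of 60 means one real second advances simulated time by sixty seconds, a `speed` below 1 runs slower than real time, and `advance` jumps the clock with no waiting at all.

```python
import time

class SimulatedClock:
    """A toy clock whose time advances independently of real time.

    speed > 1  -> simulation runs faster than real time
    speed < 1  -> simulation runs slower than real time
    speed == 0 -> simulation only moves when advance() is called
    """

    def __init__(self, start: float = 0.0, speed: float = 60.0):
        self._sim_time = start
        self._speed = speed
        self._last_real = time.monotonic()

    def now(self) -> float:
        # Convert real elapsed time into simulated elapsed time.
        real_now = time.monotonic()
        elapsed = real_now - self._last_real
        self._sim_time += elapsed * self._speed
        self._last_real = real_now
        return self._sim_time

    def advance(self, seconds: float) -> None:
        # Jump simulated time forward arbitrarily, no real waiting required.
        self._sim_time += seconds

# With speed=0 the clock is fully manual: a whole simulated hour
# passes instantly, which is exactly what a test method wants.
clock = SimulatedClock(start=0.0, speed=0.0)
clock.advance(3600)
print(clock.now())
```

This is the same trick test frameworks use when they "freeze" or fast-forward time: code under test reads `clock.now()` instead of the system clock, so the test can move time wherever it needs without actually waiting.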