The cell loss probability required in ATM networks is in the range of $10^{-9} \sim 10^{-12}$. If Monte Carlo simulation is used to analyze the performance of an ATM node, an enormous amount of computer time is required. To obtain large speed-up factors, importance sampling may be used. Since Markov-modulated processes have been used to model various high-speed network traffic sources, we consider discrete-time single-server queueing systems with Markov-modulated arrival processes, which can be used to model an ATM node. We apply importance sampling based on Large Deviation Theory to the performance evaluation of MMBP/D/1/K queues, $\sum$MMBP/D/1/K queues, and two-stage tandem queueing networks with Markov-modulated arrival processes and deterministic service times. The simulation results show that the buffer overflow probabilities obtained by importance sampling are very close to those obtained by Monte Carlo simulation, while the computer time is reduced drastically.
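To illustrate the idea, the following is a minimal sketch (not the paper's actual change of measure) of importance sampling for buffer overflow in a two-state MMBP/D/1/K queue. All parameter values ($P$, $p$, $K$, the twist parameter $t$) are hypothetical, and this sketch exponentially twists only the state-conditional arrival probabilities; an LD-based scheme such as the paper's would typically derive the twist from the rate function and may also twist the modulating-chain transitions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical two-state MMBP/D/1/K parameters (illustrative values only).
P = np.array([[0.9, 0.1],        # modulating-chain transition matrix
              [0.2, 0.8]])
p = np.array([0.15, 0.6])        # per-slot cell-arrival probability by state
K = 15                           # buffer capacity (cells)

# Exponentially twisted arrival probabilities. In an LD-based scheme the
# twist parameter t would come from the rate function so that the twisted
# system drifts toward overflow; here t is just a plausible value.
t = 2.0
p_tw = p * np.exp(t) / (p * np.exp(t) + 1.0 - p)

def busy_cycle(biased):
    """One busy cycle: start with one cell, end at empty or overflow.
    Returns (overflowed, likelihood ratio of original vs. twisted law)."""
    q, s, L = 1, 0, 1.0
    while True:
        s = rng.choice(2, p=P[s])                  # advance modulating state
        pa = p_tw[s] if biased else p[s]
        arrive = rng.random() < pa
        if biased:                                 # accumulate dP/dP~ weight
            L *= p[s] / p_tw[s] if arrive else (1 - p[s]) / (1 - p_tw[s])
        q += arrive
        if q > K:                                  # cell loss: buffer overflow
            return True, L
        q -= 1                                     # deterministic service, 1 cell/slot
        if q == 0:                                 # queue empties: cycle over
            return False, L

N = 20_000
mc = np.mean([busy_cycle(False)[0] for _ in range(N)])
is_est = np.mean([ov * L for ov, L in (busy_cycle(True) for _ in range(N))])
print(f"MC estimate of P(overflow in a cycle): {mc:.3e}")
print(f"IS estimate of P(overflow in a cycle): {is_est:.3e}")
```

Because only the arrival draws are biased, weighting each overflow indicator by the accumulated likelihood ratio keeps the estimator unbiased, while the twisted dynamics make overflow events far more frequent than under plain Monte Carlo, which is the source of the speed-up reported above.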