Real-time data processing and decision-making are increasingly critical across a wide range of applications, driven by a continuous influx of data streams. Event-driven AI workflows in serverless computing environments offer a promising approach to meeting these real-time demands efficiently. This paper presents a framework for simulating and analyzing the performance characteristics of such workflows. The proposed approach uses simulated workloads with varying event rates and durations to investigate their impact on key performance metrics, such as latency, throughput, and resource utilization, enabling a comprehensive evaluation of the trade-offs inherent in event-driven AI systems. The key finding is a trade-off between latency and throughput: as the event rate increases, average throughput improves, but average processing latency grows as well. Resource utilization remains relatively stable across event rates in the simulated scenarios (e.g., 75.55\% at 2 events/second versus 74.51\% at 10 events/second). The framework thus provides a valuable tool for understanding the performance of event-driven AI workflows and for optimizing resource allocation strategies.
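The paper's simulator is not reproduced here; as a rough sketch of the kind of experiment the abstract describes, the following assumes Poisson event arrivals, exponentially distributed processing durations, and a fixed pool of concurrent function instances. All parameters (\texttt{n\_workers}, \texttt{proc\_mean}, \texttt{horizon}) are illustrative placeholders, not values from the paper.

\begin{verbatim}
import heapq
import random

def simulate(event_rate, n_workers=4, proc_mean=0.3,
             horizon=300.0, seed=0):
    """Poisson arrivals served by a fixed worker pool.
    Returns (avg latency [s], throughput [ev/s], utilization [%])."""
    rng = random.Random(seed)
    # Min-heap of times at which each worker next becomes idle.
    worker_free = [0.0] * n_workers
    heapq.heapify(worker_free)
    t, latencies, busy = 0.0, [], 0.0
    while True:
        t += rng.expovariate(event_rate)      # next arrival
        if t > horizon:
            break
        free_at = heapq.heappop(worker_free)  # earliest-idle worker
        start = max(t, free_at)               # queue if all are busy
        service = rng.expovariate(1.0 / proc_mean)
        heapq.heappush(worker_free, start + service)
        latencies.append(start + service - t) # wait + processing
        busy += service
    return (sum(latencies) / len(latencies),
            len(latencies) / horizon,
            100.0 * busy / (n_workers * horizon))

for rate in (2, 10):
    lat, thr, util = simulate(event_rate=rate)
    print(f"{rate} ev/s: latency={lat:.3f}s "
          f"throughput={thr:.2f}/s util={util:.1f}%")
\end{verbatim}

In this toy model, raising the event rate saturates the pool, so queueing delay (and hence average latency) grows even as completed events per second rise, mirroring the latency and throughput trade-off reported above.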