In today’s digital landscape, ensuring the security and performance of web applications is paramount. To protect against cyber threats, organizations deploy web application and API protection (WAAP) solutions such as Wallarm. However, to fully reap the benefits of Wallarm, deploying filter nodes as close as possible to the customer using Amazon’s global infrastructure, including EC2 instances, Route 53, CloudFront, and Lambda functions, significantly improves performance. In this blog post, we explore the performance benefits of this strategic deployment approach and how it leverages Amazon’s robust services. While this article focuses on Amazon, the same design principles can be applied with any cloud provider.
Efficient traffic routing with Route 53
Amazon Route 53, Amazon’s scalable and highly available DNS service, plays a critical role in optimizing the performance of Wallarm filter nodes. By leveraging Route 53’s latency-based and geolocation routing policies, organizations can direct customer requests to the nearest Wallarm filter node. This intelligent routing reduces latency and ensures efficient use of resources, resulting in better performance.
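As a rough illustration, latency-based routing can be expressed as a Route 53 change batch like the one below. This is a minimal sketch, not Wallarm’s actual deployment tooling; the domain name, set identifiers, and IP addresses are placeholders.

```python
# Sketch: latency-routed A records pointing one hostname at regional
# Wallarm filter nodes. Route 53 answers each query with the record
# whose Region has the lowest measured latency to the client.
# Domain, set identifiers, and IPs are illustrative placeholders.

def latency_record(region, set_id, ip):
    """Build one latency-routed A record for a regional filter node."""
    return {
        "Action": "UPSERT",
        "ResourceRecordSet": {
            "Name": "waf.example.com.",
            "Type": "A",
            "SetIdentifier": set_id,
            "Region": region,  # Route 53 routes based on this region
            "TTL": 60,
            "ResourceRecords": [{"Value": ip}],
        },
    }

change_batch = {
    "Changes": [
        latency_record("us-east-1", "filter-us-east", "203.0.113.10"),
        latency_record("eu-west-1", "filter-eu-west", "203.0.113.20"),
    ]
}

# Applying it requires AWS credentials and a real hosted zone, e.g.:
# import boto3
# boto3.client("route53").change_resource_record_sets(
#     HostedZoneId="Z...", ChangeBatch=change_batch)
```

The same record name with multiple `SetIdentifier` entries is what lets Route 53 pick a different regional answer per client.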
Accelerated content delivery with CloudFront
Amazon CloudFront, a global content delivery network (CDN), improves the performance and availability of web applications. By integrating Wallarm filter nodes with CloudFront, organizations can distribute content and filter traffic closer to end users worldwide. This integration provides faster response times, reduced load on origin servers, and improved overall application performance.
Scalable security with Lambda functions
AWS Lambda, a serverless computing service, provides a powerful tool for improving Wallarm’s performance. Using Lambda functions, organizations can dynamically scale Wallarm filter capacity based on demand. This elasticity ensures that filter capacity is matched to the needs of the application, optimizing performance during busy periods and minimizing costs during low-traffic periods.
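To make the edge piece concrete, a CloudFront Lambda@Edge function runs in whichever region is closest to the viewer. The sketch below is a minimal viewer-request handler that stamps each request with the edge region before it continues toward the origin; the header name is an illustrative choice, not part of Wallarm’s actual integration, and a real filtering handler would inspect the request and could return a response to block it.

```python
# Minimal sketch of a CloudFront Lambda@Edge viewer-request handler.
# It tags each request with the region of the Lambda replica that
# handled it, so downstream filter nodes and logs can see where the
# traffic entered. Header name "X-Edge-Region" is illustrative.
import os

def handler(event, context):
    request = event["Records"][0]["cf"]["request"]
    # The Lambda runtime sets AWS_REGION at the replica's region;
    # it is absent when the handler is invoked locally.
    region = os.environ.get("AWS_REGION", "unknown")
    request["headers"]["x-edge-region"] = [
        {"key": "X-Edge-Region", "value": region}
    ]
    # Returning the request lets it continue to the origin; returning
    # a response object instead would short-circuit (e.g. block) it.
    return request
```

Because the function executes at the edge location serving the client, this is the same placement property the filter nodes rely on.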
Worldwide coverage for enhanced security
Deploying Wallarm filter nodes across Amazon’s global infrastructure provides organizations with comprehensive web application security coverage. With a presence in multiple regions, organizations can achieve a distributed and redundant architecture that protects against single points of failure and ensures uninterrupted protection against evolving threats. This global coverage also contributes to improved performance by reducing the distance between clients and filter nodes.
Latency reduction with CloudFront and localized EC2 instances
Amazon Elastic Compute Cloud (EC2) instances provide scalable compute resources around the world. By deploying Wallarm filter nodes across Amazon regions, organizations can significantly reduce latency without compromising security. This approach ensures that traffic is processed and filtered as close as possible to the client, minimizing round-trip time and improving overall application performance.
In the ideal baseline, the client connection is in the same region as the entire application or API packet route.
Your CDN solution, in this case CloudFront, optimizes connections and caches frequently requested content, improving the overall customer experience. Typically, however, the compute powering applications and APIs is not in the same region as the client, and latency roughly doubles, even when the security solution is deployed in the same infrastructure as the application or API. Since packets must be mirrored for analysis, the destination of that mirrored traffic directly affects the overall transit time of the packet.
A Wallarm Security Edge implementation brings the initial traffic analysis and real-time protection as close as possible to the customer, in the same way your CDN solution does.
When end-to-end encryption is required, the added latency of the TLS operations must also be considered. We typically see an added latency of 8-10 ms when TLS is added to all connections.
The table below summarizes the test results we observed in the scenarios outlined above across Amazon’s global infrastructure.
| Regional location | Traffic route | Avg. latency |
|---|---|---|
| Application and client in the same region | Baseline without CloudFront Lambda@Edge | 5-6 ms |
| Filter nodes in region with application and client | Wallarm with CloudFront Lambda@Edge | +5-10 ms |
| Filter nodes in region with application and client + TLS | Wallarm with CloudFront Lambda@Edge | +13-20 ms |
| Application outside the region with the client | Baseline without CloudFront Lambda@Edge | 50-60 ms |
| Filter nodes and application out of region with the client | Wallarm with CloudFront Lambda@Edge | +50-60 ms |
| Filter nodes and application out of region with the client + TLS | Wallarm with CloudFront Lambda@Edge | +58-70 ms |
| Filter nodes in region with client | Wallarm with CloudFront Lambda@Edge | +5-10 ms |
| Filter nodes in region with client + TLS | Wallarm with CloudFront Lambda@Edge | +13-20 ms |
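Reading the "+" rows as overheads added to the corresponding baseline, the figures above compose additively: baseline transit, plus filter-node overhead, plus TLS handshake cost when end-to-end encryption is on. A quick sketch of that arithmetic, using the approximate ranges from the table:

```python
# Rough latency model from the table above: each component is an
# approximate (low, high) range in milliseconds, and totals are the
# component-wise sums. Values come from the reported test results.

def add_ranges(*ranges):
    """Sum (low, high) millisecond ranges component-wise."""
    return (sum(r[0] for r in ranges), sum(r[1] for r in ranges))

BASELINE_SAME_REGION = (5, 6)     # client and application co-located
BASELINE_CROSS_REGION = (50, 60)  # application far from the client
FILTER_IN_REGION = (5, 10)        # filter node in the client's region
FILTER_OUT_OF_REGION = (50, 60)   # filtering far from the client
TLS = (8, 10)                     # added TLS cost per connection

# Best case: everything in the client's region, with TLS.
same_region_tls = add_ranges(BASELINE_SAME_REGION,
                             FILTER_IN_REGION, TLS)       # (18, 26)

# Worst case: application and filter nodes both out of region, TLS on.
cross_region_tls = add_ranges(BASELINE_CROSS_REGION,
                              FILTER_OUT_OF_REGION, TLS)  # (108, 130)
```

The gap between those two totals is what edge placement of the filter nodes recovers.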
In today’s fast-paced digital world, it is critical to optimize web application performance while ensuring robust security. By deploying Wallarm filter nodes as close as possible to the customer using Amazon’s global infrastructure, organizations can gain significant performance benefits. Leveraging EC2 instances, Route 53, CloudFront, and Lambda functions enables reduced latency, efficient traffic routing, accelerated content delivery, and scalable security. This strategic approach not only improves application performance but also provides organizations with comprehensive security coverage in a rapidly evolving threat landscape. By leveraging the power of services from Wallarm and Amazon, organizations can create a secure and high-performing web application ecosystem.
The post Maximizing Performance with Wallarm Filter Nodes in Amazon’s Global Infrastructure appeared first on Wallarm.
*** This is a syndicated blog from Wallarm’s Security Bloggers Network, written by wlrmblog. Read the original post at: https://lab.wallarm.com/maximizing-performance-with-wallarm-filtering-nodes-in-amazons-global-infrastructure/