The outstanding network data processing capability of the WedgeOS is delivered by the patented SubSonic Engine. Designed from the ground up for low latency, high concurrency, and gigabit throughput in deep content inspection of the commonly used application-layer network protocols and data compression formats, the SubSonic Engine sets the WedgeOS platform apart from its competitors as the most accurate and highest-performing web security solution.
The SubSonic Engine comprises a set of architectural components that work in tandem to deliver this performance: a multi-threaded network data processing mechanism that scales to tens of thousands of concurrent data sessions; an application content recognition module that dramatically reduces network data processing latency; and an adaptive resource allocation algorithm that improves overall processing performance across all data sessions.
Multi-threaded Network Data Processing
Today’s busy network junctions typically carry hundreds or thousands of simultaneous traffic sessions over a variety of network protocols. An effective Web Security Appliance must sustain the required number of connections to intercept and inspect web content; without that capacity, end users experience delays that make the service unusable. The WedgeOS’ unique multi-threaded network data processing mechanism reduces both the system resources and the context-switching time needed to handle individual scanning sessions, especially on multi-CPU and multi-core systems. This allows the WedgeOS to sustain tens of thousands of concurrent sessions, compared with the hundreds maintained by traditional multi-process approaches.
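The thread-per-session idea can be pictured with a minimal sketch: a shared thread pool handles many inspection sessions concurrently without the per-process overhead of forking a new process for each connection. All names and the toy payload check below are illustrative assumptions, not the WedgeOS implementation.

```python
import concurrent.futures

def inspect_session(session_id: int, payload: bytes) -> str:
    # Hypothetical per-session inspection. Each session runs on a
    # lightweight thread, so context switches stay cheap and thousands
    # of sessions can be in flight at once.
    return "blocked" if b"malware" in payload else "clean"

# One shared pool of worker threads serves all sessions; a multi-process
# design would instead pay a full process per connection.
with concurrent.futures.ThreadPoolExecutor(max_workers=64) as pool:
    futures = [pool.submit(inspect_session, i, b"GET / HTTP/1.1")
               for i in range(10_000)]
    results = [f.result() for f in futures]
```

On a multi-core system the pool spreads the sessions across all cores while keeping per-session memory overhead to a single thread stack.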
SubSonic Content Recognition
Scanning a file for malicious signatures takes time and adds latency to the end user's web experience. SubSonic Content Recognition reduces latency and keeps computing resources available at peak scanning times by first checking whether an identical copy of the payload has been inspected before. If so, the previous inspection result is simply retrieved instead of re-running the typically expensive content inspection algorithms.
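One common way to implement this kind of "inspected before?" check is to key a verdict cache on a cryptographic digest of the payload. The sketch below assumes that design; the function names and the stand-in scan are hypothetical.

```python
import hashlib

scan_cache: dict[str, str] = {}  # payload digest -> cached verdict
scan_count = 0                   # counts how often the expensive path runs

def expensive_scan(payload: bytes) -> str:
    # Stand-in for the full content-inspection pipeline.
    global scan_count
    scan_count += 1
    return "blocked" if b"eicar" in payload.lower() else "clean"

def inspect(payload: bytes) -> str:
    # Content recognition: if an identical payload was scanned before,
    # return the cached verdict instead of rescanning.
    digest = hashlib.sha256(payload).hexdigest()
    if digest not in scan_cache:
        scan_cache[digest] = expensive_scan(payload)
    return scan_cache[digest]

verdict1 = inspect(b"hello world")  # first sight: full scan runs
verdict2 = inspect(b"hello world")  # identical payload: cache hit, no rescan
```

Because the cache is keyed on the content itself rather than the URL, the same file fetched from two different mirrors still produces a single scan.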
Time Quantum Division (TQD)
During peak usage, scanning large content (file downloads, large images) can consume resources and degrade the experience of users who are accessing smaller files (web pages). The TQD algorithm manages the system resources consumed by lengthy scans, ensuring that a large file does not impact users who are accessing smaller files.
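A time-quantum scheduler of this kind can be sketched as round-robin scanning: each session is scanned one fixed-size quantum at a time, so a large download cannot monopolize the scanner while small pages wait. The quantum size and all names below are assumptions for illustration.

```python
from collections import deque

QUANTUM = 4096  # bytes scanned per turn (assumed value)

def scan_chunk(chunk: bytes) -> None:
    pass  # placeholder for the real per-chunk inspection work

def scan_round_robin(sessions: dict[str, bytes]) -> list[str]:
    # Rotate through all sessions, granting each one quantum of scanning
    # per turn; return session names in the order they finish.
    queue = deque((name, memoryview(data)) for name, data in sessions.items())
    finished = []
    while queue:
        name, remaining = queue.popleft()
        scan_chunk(bytes(remaining[:QUANTUM]))
        remaining = remaining[QUANTUM:]
        if len(remaining):
            queue.append((name, remaining))  # more quanta needed later
        else:
            finished.append(name)            # session fully scanned
    return finished

order = scan_round_robin({"big.iso": b"x" * 20_000, "page.html": b"y" * 1_000})
# the small page completes within its first quantum, ahead of the download
```

The small file finishes after a single turn regardless of how much of the large file remains, which is exactly the fairness property the paragraph above describes.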
Green Stream Technology
In today’s heterogeneous networks, keeping application sessions robust is a major challenge. Green Stream technology was developed to solve the “First Packet” problem that arises when scanning large files: scanning a large file introduces latency and increases the probability that the application will time out. Green Stream releases data packets while a large file is still being scanned, reducing the threat of an application time-out and improving the end user experience.
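The release-while-scanning idea can be sketched as a generator that forwards each chunk to the client as soon as it has been inspected, instead of buffering the whole file until the scan completes. The chunk size and names are illustrative assumptions, not the Green Stream internals.

```python
import io
from typing import Callable, Iterator

CHUNK = 1024  # assumed forwarding granularity

def green_stream(source: io.BufferedIOBase,
                 scan: Callable[[bytes], None]) -> Iterator[bytes]:
    # Inspect the file chunk by chunk, yielding each chunk downstream
    # immediately so the client keeps receiving data and never sits idle
    # long enough to time out waiting for the first packet.
    while chunk := source.read(CHUNK):
        scan(chunk)   # inspect this chunk
        yield chunk   # forward it right away instead of buffering the file

received = b"".join(green_stream(io.BytesIO(b"a" * 5_000), scan=lambda c: None))
```

In a real proxy the trickled data keeps the TCP session alive; if a later chunk turns out to be malicious, the connection can still be terminated before the file is complete.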