C – OpenSSL causes packet loss? Strange CPU usage

I’m writing an application that reads packets from a UDP socket and then decrypts them with OpenSSL.

The main loop looks like this:

void receive(void) {
    while (1) {
        /* blocking read of one datagram from the UDP socket */
        ssize_t n = read(udp_sock, buf, sizeof(buf));
        if (n > 0)
            decrypt_packet(buf, n);
    }
}

The program worked fine until I added decryption. Now many packets are being dropped between the kernel buffer and my application (netstat -su shows RcvbufErrors: 77123 and growing). The packets are quite large (60 KB) and I am running over 1 Gbps Ethernet (the problem starts above roughly 100 Mbps).

That would sound normal – decryption takes too long and packets arrive too fast. The problem is that CPU usage on both the sender and the receiver never exceeds 30%.

After commenting out this statement in decrypt_packet(), the problem disappears:
AES_ctr128_encrypt();

My question is – is it possible that OpenSSL uses some instruction set that doesn’t count towards CPU usage (I use htop and the GNOME System Monitor)? If not, what else could cause such packet loss while CPU capacity is apparently still available?

Solution

How many CPU cores does your system have, and is your code single-threaded? A single thread can max out one core; on a four-core machine that shows up as only 25% of the total available CPU.
