C – Do I need to call timer_delete to remove the timer every time?

I used timer_create() in the code below. It should trigger the handler once, after 10 seconds.

#include <errno.h>
#include <signal.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <time.h>

static void t (union sigval sv);   /* notification callback, defined elsewhere */

timer_t timer;
struct itimerspec itimer = { { 0, 0 }, { 10, 0 } };   /* one shot, 10 s */
struct sigevent si;

memset (&si, 0, sizeof (struct sigevent));
si.sigev_notify = SIGEV_THREAD;
si.sigev_notify_attributes = NULL;
si.sigev_notify_function = t;

if (timer_create (CLOCK_REALTIME, &si, &timer) < 0) {
    fprintf (stderr, "[%d]: %s\n", __LINE__, strerror (errno));
    exit (errno);
}
if (timer_settime (timer, 0, &itimer, NULL) < 0) {
    fprintf (stderr, "[%d]: %s\n", __LINE__, strerror (errno));
    exit (errno);
}

My question is: after 10 seconds, once my handler has been triggered, do I have to call timer_delete() to delete the timer before exiting the process? Or, because the timer only fires once, is there no need to remove it explicitly?


Yes, you need to explicitly remove the timer.

Look at the man(2) page for timer_create: each timer consumes resources in the kernel, and the total number of timers the kernel can hold at one time is limited.

If you don’t delete your timers, you’ll eventually run out of them, and any application that needs to create timers is likely to fail.

It’s like a memory leak – clean up the resources you use or you’ll end up running out.

Responding to your follow-up question
In the comments below, you asked whether it is possible to call timer_delete from inside the callback function. I wasn’t sure how to answer, so I opened a question about it myself. You can experiment to see whether it works, but I recommend against it: I’ve never seen code that removes a timer from inside its own callback, and the idea of freeing the timer’s resources before event processing completes makes me nervous.

Also, a test may succeed sometimes, but since you are dealing with asynchronous events, you can get random failures. Your main program has to keep running until the callback completes anyway (they run in the same process, just on different threads), so you might as well remove the timer in the main thread before exiting. I think that’s the safer solution, and easier to debug.
