Threadsafe

1 + 1 = 1 (the kind of result you can get when two threads update the same data at the same time)

If a sprintf-style function uses a global buffer to build its result string, it will
not work well when two threads call it at the same time. Both threads write
to the same data area, so the string returned to one thread depends on what
the other thread is doing. Handling access to shared resources is the
basic problem of multi-threaded programming, and code that is designed to
handle it is called thread safe.

Code is thread safe if:

All shared data between the threads is read-only.

Shared data that can be written is synchronized so only one thread can interact with it at a time.

Access to shared data is synchronized by using synchronization primitives. To use some shared data, lock its synchronization primitive, do the work on the shared data, and then unlock the primitive. If another thread tries to acquire the lock while some thread holds it, the acquiring thread will stall and wait inside the lock function. It stays there until it gets the lock.

Lock-Free Programming

Lock-free programming is not simply programming without locks; the goal is to avoid any thread 'holding the lock' and thereby blocking the others. Shared data is instead updated with atomic operations, so some thread can always make progress.

Race condition

When the outcome of a thread can change depending on the timing of other threads, there is a race condition.

Deadlocks

A deadlock is when two or more threads block each other from progressing because each holds a lock on a shared resource the other needs. Say there are two threads (T1 and T2): T1 tries to move an item from A to B and T2 from B to A. T1 gets a lock on A and picks up its item, and at the same time T2 gets a lock on B and picks up its item. When T1 tries to get a lock on B it has to wait, and at the same time T2 tries to lock A and also starts to wait. They will now both wait until the power of the computer runs out.

Resource starvation

If a thread never gets the resources it needs, it will stall and fail to progress. This can happen if a thread T1 wants access to a resource but other threads with higher priority always take it first.

False Sharing

False sharing is a cache problem that occurs when two or more threads use separate pieces of data that happen to sit on the same cache line. When one thread changes its data, the whole cache line is invalidated and must be re-synchronized with the other thread's core. That slows things down even though the threads never touch the same data.