Threads and Concurrency
1. Introduction: From Processes to Threads
Traditionally, the operating system used the process as the basic unit of execution. Each process has:
- Its own address space
- Its own resources
- Its own execution context
However, modern applications (web browsers, servers, IDEs) need multiple activities to occur concurrently within the same program. Creating multiple processes for this purpose is expensive.
This leads to the concept of a thread.
2. Concept of a Thread
What Is a Thread?
A thread is the basic unit of CPU utilization within a process.
A thread consists of:
- Program Counter (PC)
- Register set
- Stack
Threads share the following with other threads in the same process:
- Code section
- Data section
- Heap
- Open files and resources
👉 A process may contain one or more threads.
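To make the shared-versus-private distinction concrete, here is a minimal Java sketch (the class and field names are made up for illustration): both threads update the same static field, but each keeps its own local variable on its own stack.

```java
// Minimal illustrative sketch: two threads inside one process.
// Both see the same static field (shared data), but each has its own
// local variable (private stack).
public class SharedVsPrivate {
    static int shared = 0;                 // shared by all threads in the process

    public static void main(String[] args) throws InterruptedException {
        Runnable work = () -> {
            int local = 0;                 // private: lives on this thread's stack
            for (int i = 0; i < 1000; i++) {
                local++;
                shared++;                  // unsynchronized update, for illustration only
            }
            System.out.println(Thread.currentThread().getName() + " local = " + local);
        };

        Thread t1 = new Thread(work, "T1");
        Thread t2 = new Thread(work, "T2");
        t1.start();
        t2.start();
        t1.join();
        t2.join();

        // "shared" may be less than 2000 because the update is not synchronized.
        System.out.println("shared = " + shared);
    }
}
```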
Single-Threaded vs Multithreaded Process
Single-threaded process
- One execution path
- If the thread blocks (e.g., on I/O), the entire process blocks

Multithreaded process
- Multiple execution paths within the same process
- One thread can continue even if another is blocked
📌 Example (Web Browser):
- One thread for the UI
- One thread for network operations
- One thread for rendering
3. Benefits of Multithreading
There are four major benefits of multithreading:
1️⃣ Responsiveness
- Multithreading allows an application to remain responsive even if part of it is blocked.
- Especially important for interactive applications.
Example:
- A word processor continues responding to user input while saving a file in the background.
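A minimal Java sketch of this idea (all names are illustrative): a background thread stands in for the slow save, while the main thread keeps handling input.

```java
// Illustrative sketch: a background "save" thread keeps the main thread
// free to handle (simulated) user input events.
public class ResponsiveSave {
    public static void main(String[] args) throws InterruptedException {
        Thread saver = new Thread(() -> {
            try {
                Thread.sleep(2000);        // stand-in for a slow disk write
                System.out.println("Document saved.");
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        }, "saver");
        saver.start();

        // The main thread stays responsive while the save runs.
        for (int i = 1; i <= 4; i++) {
            System.out.println("Handling user input event " + i);
            Thread.sleep(500);
        }
        saver.join();
    }
}
```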
2️⃣ Resource Sharing
- Threads share the memory and resources of their process.
- Communication between threads is faster and easier than inter-process communication (IPC).
Example:
- Multiple threads accessing the same in-memory data structure.
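A small illustrative sketch in Java, assuming a ConcurrentHashMap stands in for the shared in-memory data structure: both threads read and update it directly, with no IPC mechanism (pipes, sockets) involved.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Illustrative sketch: several threads update the same in-memory map directly.
public class SharedStructure {
    public static void main(String[] args) throws InterruptedException {
        Map<String, Integer> hits = new ConcurrentHashMap<>();

        Runnable worker = () -> {
            for (int i = 0; i < 1000; i++) {
                hits.merge("page", 1, Integer::sum);   // thread-safe update
            }
        };

        Thread a = new Thread(worker);
        Thread b = new Thread(worker);
        a.start();
        b.start();
        a.join();
        b.join();

        System.out.println("hits = " + hits.get("page"));   // 2000
    }
}
```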
3️⃣ Economy
- Creating and managing threads is cheaper than creating and managing processes.
- Lower overhead in:
  - Creation
  - Context switching
  - Termination
4️⃣ Scalability
- Multithreading allows programs to take advantage of multiple CPU cores.
- Different threads can run on different processors simultaneously.
Example:
- A database server using separate threads to handle client requests on different cores.
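A hedged sketch of this benefit in Java (the class and task names are illustrative): a thread pool sized to the number of available cores lets independent requests run in parallel on a multi-core machine.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

// Illustrative sketch: one pool thread per available core, so independent
// requests can run in parallel on a multi-core system.
public class PerCorePool {
    public static void main(String[] args) throws InterruptedException {
        int cores = Runtime.getRuntime().availableProcessors();
        ExecutorService pool = Executors.newFixedThreadPool(cores);

        for (int i = 0; i < 8; i++) {
            final int request = i;
            pool.submit(() ->
                System.out.println("request " + request + " handled by "
                        + Thread.currentThread().getName()));
        }

        pool.shutdown();
        pool.awaitTermination(5, TimeUnit.SECONDS);
    }
}
```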
4. Threads and Concurrency
What Is Concurrency?
Concurrency is the ability of a system to execute multiple tasks that overlap in time.
- On a single-core system → achieved via time-sharing
- On a multi-core system → achieved via true parallelism
Threads are the primary mechanism used by the OS to support concurrency within a process.
5. User Threads and Kernel Threads
Threading models are categorized by the relationship between user threads and kernel threads.
User Threads
- Managed by a user-level thread library
- The kernel is unaware of them
- Faster to create and manage
Kernel Threads
- Managed directly by the OS
- Scheduled by the kernel
- Slower but more powerful
6. Multithreading Models
1️⃣ Many-to-One Model
Description
- Many user threads mapped to one kernel thread
- Thread management done in user space

Advantages
- Fast thread creation
- Efficient context switching

Disadvantages
- If one thread blocks, all threads block
- No parallelism on multi-core systems

Example
- Early Java thread libraries
2️⃣ One-to-One Model
Description
- Each user thread maps to a kernel thread

Advantages
- True parallelism
- One thread blocking does not affect others

Disadvantages
- Overhead of kernel thread creation
- The OS may limit the number of threads

Examples
- Linux
- Windows
- macOS
📌 This is the most commonly used model today.
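As a rough illustration: on mainstream JVMs running on these systems, each java.lang.Thread is typically backed by its own kernel thread (the one-to-one model), so a blocking call in one thread does not stall another. The sketch below assumes that mapping.

```java
// Illustrative sketch, assuming the one-to-one mapping used by mainstream
// JVMs on Linux, Windows, and macOS: one thread blocks in the kernel while
// the other keeps running.
public class OneToOne {
    public static void main(String[] args) throws InterruptedException {
        Thread blocking = new Thread(() -> {
            try {
                Thread.sleep(1000);   // this thread blocks in the kernel
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
            System.out.println("blocking thread finished");
        });

        Thread working = new Thread(() ->
                System.out.println("working thread keeps running"));

        blocking.start();
        working.start();
        blocking.join();
        working.join();
    }
}
```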
3️⃣ Many-to-Many Model
Description
- Many user threads mapped to many kernel threads
- The OS decides how many kernel threads to create

Advantages
- Combines the flexibility of user threads with the parallelism of kernel threads
- Avoids the blocking problem

Disadvantages
- Complex to implement

Example
- Solaris (older versions)
4️⃣ Two-Level Model (Variation)
- Similar to many-to-many
- Allows binding specific user threads to kernel threads
7. Multithreading Issues (Brief Overview)
Major challenges:
- Thread synchronization
- Race conditions
- Deadlocks
- Data consistency
These issues arise because threads share resources.
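A small Java sketch of a race condition and one way to avoid it (the counter names are made up for illustration): the plain `counter++` can lose updates because it is not atomic, while `AtomicInteger` performs the increment atomically.

```java
import java.util.concurrent.atomic.AtomicInteger;

// Illustrative sketch of a race condition and one possible fix.
// "unsafeCounter++" is three steps (read, add, write), so concurrent
// updates can be lost; AtomicInteger increments atomically.
public class RaceDemo {
    static int unsafeCounter = 0;
    static AtomicInteger safeCounter = new AtomicInteger();

    public static void main(String[] args) throws InterruptedException {
        Runnable work = () -> {
            for (int i = 0; i < 100_000; i++) {
                unsafeCounter++;                 // race condition
                safeCounter.incrementAndGet();   // atomic, no lost updates
            }
        };

        Thread t1 = new Thread(work);
        Thread t2 = new Thread(work);
        t1.start();
        t2.start();
        t1.join();
        t2.join();

        System.out.println("unsafe = " + unsafeCounter);       // often less than 200000
        System.out.println("safe   = " + safeCounter.get());   // always 200000
    }
}
```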
8. Summary Table
| Aspect | Threads |
|---|---|
| Basic unit | CPU execution |
| Resource sharing | Yes (within process) |
| Creation cost | Low |
| Communication | Fast |
| Enables | Concurrency & parallelism |
9. Key Takeaways
- A thread is a lightweight process
- Multithreading improves performance, responsiveness, and scalability
- Modern OSes use the one-to-one model
- Threads are essential for concurrent and parallel programming


