Understanding Processes, Threads, Coroutines, Synchronization, Asynchronous Execution, Blocking vs Non-blocking, Concurrency, Parallelism, and Serial Execution

Tech · May 10

Processes

A process represents an instance of a running program. It is the fundamental unit for system resource allocation, with each process maintaining its own isolated memory space. This isolation ensures that data between processes is not shared directly, which increases overhead but enhances security and stability.

To create a new process in Python, use the Process class from the multiprocessing module. The target function to execute is passed without parentheses, along with arguments via args and kwargs.

from multiprocessing import Process
import time

def worker_task(label):
    for i in range(5):
        print(f'Worker {label} {i} started')
        time.sleep(2)
        print(f'Worker {label} {i} finished')

if __name__ == '__main__':
    proc = Process(target=worker_task, args=('kobe',))
    proc.start()
    print('Main process completed')

Processes do not share global variables; each has its own copy of data.
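A minimal sketch makes this concrete: a child process increments a module-level variable, but the parent's copy is unchanged because each process gets its own memory space (the variable names here are illustrative):

```python
from multiprocessing import Process

counter = 0  # global in the parent process

def increment():
    # This modifies the child process's own copy of the global.
    global counter
    counter += 100

if __name__ == '__main__':
    p = Process(target=increment)
    p.start()
    p.join()
    # The parent's counter is untouched by the child's change.
    print(counter)  # 0
```

To actually share data between processes you would need an explicit mechanism such as a Queue or Pipe from the same module.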

Threads

Within a single process, at least one thread exists—the main thread. Threads are the smallest units of execution within a process and share the same memory space, enabling efficient communication through shared variables.

However, this sharing can lead to race conditions or deadlocks if access to shared resources is not properly synchronized.

Creating a thread uses the Thread class from the threading module:

import threading
import time

def task():
    for i in range(5):
        print(f'Thread {i} started')
        time.sleep(2)
        print(f'Thread {i} finished')

if __name__ == '__main__':
    thread_obj = threading.Thread(target=task)
    thread_obj.start()
    print('Main thread ended')

Threads depend on their parent process and cannot exist independently.
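To guard against the race conditions mentioned earlier, a threading.Lock can serialize access to shared state. A minimal sketch (the counter and thread count are illustrative):

```python
import threading

total = 0
lock = threading.Lock()

def add_many(n):
    global total
    for _ in range(n):
        # Only one thread at a time may execute inside the lock,
        # so the read-modify-write of `total` cannot be interleaved.
        with lock:
            total += 1

threads = [threading.Thread(target=add_many, args=(100_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(total)  # 400000
```

Without the lock, the final total can come up short on some interpreter versions, because `total += 1` is not atomic.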

Coroutines

Coroutines are lightweight, user-space threads managed entirely by the application rather than the OS. They offer high performance due to minimal context-switching costs and are typically implemented using libraries like gevent. A single thread can host multiple coroutines, allowing concurrent execution without the overhead of traditional threading.
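The text mentions gevent; as a dependency-free illustration of the same idea, here is a sketch using the standard library's asyncio, where several coroutines share a single thread and yield control to each other at await points:

```python
import asyncio

async def fetch(name, delay):
    # `await` suspends this coroutine so others can run on the same thread.
    await asyncio.sleep(delay)
    return f'{name} done'

async def main():
    # Both coroutines wait concurrently; total time is roughly
    # max(delay), not the sum of the delays.
    results = await asyncio.gather(fetch('a', 0.1), fetch('b', 0.1))
    print(results)

asyncio.run(main())
```

gevent achieves a similar effect with greenlets and monkey-patched blocking calls, but the single-thread, many-coroutines shape is the same.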

Synchronous Execution

In synchronous operations, tasks execute sequentially—each must complete before the next begins. There is no overlap; control flow waits for one operation to finish before proceeding.

Asynchronous Execution

Asynchronous behavior allows multiple tasks to proceed independently. While waiting for I/O or other operations, a task can continue processing other work instead of idling. Results from one task may be retrieved later via callbacks, enabling efficient handling of concurrent operations without blocking.
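The callback pattern described above can be sketched with the standard library's concurrent.futures: a task is submitted, the main flow continues immediately, and a callback fires when the result is ready (the function names are illustrative):

```python
from concurrent.futures import ThreadPoolExecutor
import time

def slow_square(n):
    time.sleep(0.1)  # stand-in for I/O latency
    return n * n

def on_done(future):
    # Invoked automatically once the result is available.
    print('result:', future.result())

with ThreadPoolExecutor() as pool:
    fut = pool.submit(slow_square, 6)
    fut.add_done_callback(on_done)
    print('submitted; main flow continues without waiting')
```

Here the "submitted" line prints before the result, since the main flow does not stop to wait for the worker.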

Blocking Operations

A blocking operation halts the current thread until a condition is met—for example, when calling join() on a thread, the main thread pauses until the child thread finishes.

import threading
import time

def background_work():
    for i in range(5):
        print(f'Background {i} started')
        time.sleep(2)
        print(f'Background {i} finished')

if __name__ == '__main__':
    worker = threading.Thread(target=background_work)
    worker.start()
    worker.join()  # Main thread blocks here
    print('Main thread completed')

Non-blocking Operations

Non-blocking actions allow the program to continue executing immediately, even if the requested operation isn't yet complete. This avoids idle time and improves responsiveness.
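A simple sketch of the non-blocking style uses queue.Queue.get_nowait(), which raises an exception instead of waiting, leaving the caller free to do other work between polls (the payload and delays are illustrative):

```python
import queue
import threading
import time

q = queue.Queue()

def producer():
    time.sleep(0.2)
    q.put('payload')

threading.Thread(target=producer).start()

# Poll without blocking: get_nowait() raises queue.Empty
# immediately rather than suspending the thread.
while True:
    try:
        item = q.get_nowait()
        break
    except queue.Empty:
        # The program could do useful work here instead of idling.
        time.sleep(0.05)
print(item)
```

Compare this with the blocking example above, where join() suspends the main thread until the worker finishes.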

Note: Synchronous/asynchronous refers to task scheduling patterns; blocking/non-blocking describes how code execution proceeds during waits.

Concurrency

Concurrency means handling multiple tasks over time by rapidly switching between them. Although only one task runs at a time at the hardware level, the quick alternation creates the illusion of simultaneous execution. The key is the ability to manage multiple tasks efficiently, not necessarily executing them simultaneously.
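One way to picture this rapid switching on a single thread is a toy round-robin scheduler built from generators—a sketch of the idea, not how any real runtime schedules work:

```python
def task(name, steps):
    for i in range(steps):
        # yield hands control back to the scheduler after each step
        yield f'{name}{i}'

def round_robin(*tasks):
    # Interleave tasks on a single thread: only one step runs at a
    # time, but every task makes progress—concurrency without
    # parallelism.
    tasks = list(tasks)
    order = []
    while tasks:
        for t in tasks[:]:
            try:
                order.append(next(t))
            except StopIteration:
                tasks.remove(t)
    return order

print(round_robin(task('a', 2), task('b', 2)))
# ['a0', 'b0', 'a1', 'b1']
```

The interleaved output shows both tasks advancing even though the hardware only ever executes one step at a time.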

Parallelism

Parallelism involves executing multiple tasks simultaneously across multiple processors or cores. Tasks run at the same time both at the macro level (whole tasks) and at the micro level (individual instructions). True parallelism requires multiple execution units and enables actual simultaneous processing.
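In Python, true parallelism for CPU-bound work is typically achieved with processes rather than threads. A minimal sketch using multiprocessing.Pool, where each worker process can run on a separate core (the worker count and inputs are illustrative):

```python
from multiprocessing import Pool

def square(n):
    return n * n

if __name__ == '__main__':
    # Each worker is a separate process with its own interpreter,
    # so the calls can genuinely run at the same time on multiple cores.
    with Pool(processes=4) as pool:
        print(pool.map(square, [1, 2, 3, 4]))  # [1, 4, 9, 16]
```

Threads in CPython are constrained by the GIL for CPU-bound code, which is why the process-based pool is the usual route to parallel speedups.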

Serial Execution

Serial execution performs tasks one after another, strictly in order. Each subsequent operation must wait for the previous one to complete before starting. This approach lacks overlap and is generally less efficient for I/O-heavy or compute-intensive workloads.

Tags: process
