Process - (Kernel) Thread (Lightweight processes - LWP)

(Figure: process states)

About

A thread is a unit of execution within a process (the first thread created by the OS is known as the main thread). Threads exist within a process; a process acts as a container for threads.

A process is a unit of resources, while a thread is a unit of:

  • scheduling
  • and execution.

Threads are also known as lightweight processes, whereas a process is a "heavyweight" unit of kernel scheduling: creating, destroying, and switching processes is relatively expensive.

Threads are effectively processes that run in the same memory context and share other resources with their parent processes, such as open files.

The threads independently execute code that operates on values and objects residing in a shared main memory.
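As a small illustration of threads operating on values in shared main memory, the sketch below (class name and counts are illustrative) starts two Java threads that increment the same AtomicInteger:

```java
import java.util.concurrent.atomic.AtomicInteger;

public class SharedCounter {
    // Both threads operate on this single object in shared main memory.
    static final AtomicInteger counter = new AtomicInteger(0);

    public static void main(String[] args) throws InterruptedException {
        Runnable task = () -> {
            for (int i = 0; i < 10_000; i++) {
                counter.incrementAndGet(); // atomic read-modify-write
            }
        };
        Thread t1 = new Thread(task);
        Thread t2 = new Thread(task);
        t1.start();
        t2.start();
        t1.join(); // wait for both threads to finish
        t2.join();
        System.out.println(counter.get()); // prints 20000: no updates are lost
    }
}
```

No data is copied between the threads: both simply dereference the same object in the process's address space.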

Threads may be supported:

  • by having many hardware processors,
  • by time-slicing a single hardware processor,
  • or by time-slicing many hardware processors.

Multithreading is a widespread programming and execution model that allows multiple threads to exist within the context of a single process.

A hardware thread is a thread of code executing on a logical core.

Both processes and threads provide an execution environment, but creating a new thread requires fewer resources than creating a new process.

Threads share the process's resources, including memory and open files. This makes for efficient, but potentially problematic, communication.
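The "potentially problematic" part is that unsynchronized updates to shared data can interleave and lose writes. A minimal sketch (class and counts are illustrative) guards a shared field with the synchronized keyword so each increment is atomic:

```java
public class SafeCounter {
    private int value = 0;

    // Without synchronized, two threads could read the same value
    // and both write back value + 1, losing one increment.
    public synchronized void increment() {
        value++;
    }

    public synchronized int get() {
        return value;
    }

    public static void main(String[] args) throws InterruptedException {
        SafeCounter c = new SafeCounter();
        Runnable task = () -> {
            for (int i = 0; i < 5_000; i++) c.increment();
        };
        Thread a = new Thread(task);
        Thread b = new Thread(task);
        a.start();
        b.start();
        a.join();
        b.join();
        System.out.println(c.get()); // prints 10000 with synchronization
    }
}
```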


Every application has at least one thread, and often several if you count "system" threads that do things like memory management and signal handling (multithreaded execution).

A thread is a sequence of instructions that execute sequentially. If there are multiple threads, then the processor can work on one for a while, then switch to another, and so on.

In general the programmer has no control over when each thread runs; the operating system (specifically, the scheduler) makes those decisions. As a result, the programmer cannot tell when statements in different threads will be executed.
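A minimal sketch of this non-determinism (thread names are illustrative): each thread below records three events, but the relative order of entries from the two threads depends on scheduling decisions and can differ from run to run.

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class Interleaving {
    // Collect events in a thread-safe list; the order of entries
    // from different threads is decided by the scheduler, not the program.
    static List<String> run() throws InterruptedException {
        List<String> events = Collections.synchronizedList(new ArrayList<>());
        Runnable task = () -> {
            for (int i = 0; i < 3; i++) {
                events.add(Thread.currentThread().getName() + ": step " + i);
            }
        };
        Thread a = new Thread(task, "worker-A");
        Thread b = new Thread(task, "worker-B");
        a.start();
        b.start();
        a.join();
        b.join();
        return events;
    }

    public static void main(String[] args) throws InterruptedException {
        // Six lines in total; their interleaving varies between runs.
        Interleaving.run().forEach(System.out::println);
    }
}
```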

Although threads seem to be a small step from sequential computation, in fact, they represent a huge step. They discard the most essential and appealing properties of sequential computation:

  • understandability,
  • predictability,
  • and determinism.

Threads, as a model of computation, are wildly non-deterministic, and the job of the programmer becomes one of pruning that non-determinism.

The Problem with Threads, Edward A. Lee, UC Berkeley, 2006

Concurrency

Threads in the same process share the same address space. This allows them to exchange data without the overhead or complexity of IPC (inter-process communication).
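For example, a producer thread can hand values to a consumer thread through a shared in-memory queue, with no pipe, socket, or other IPC mechanism involved (the class name and values below are illustrative):

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.atomic.AtomicInteger;

public class InProcessExchange {
    // A producer thread hands integers to a consumer thread through
    // a queue that both threads reference in the shared address space.
    static int exchange() throws InterruptedException {
        BlockingQueue<Integer> queue = new ArrayBlockingQueue<>(10);
        AtomicInteger sum = new AtomicInteger();

        Thread producer = new Thread(() -> {
            for (int i = 0; i < 5; i++) {
                try {
                    queue.put(i); // blocks if the queue is full
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            }
        });
        Thread consumer = new Thread(() -> {
            for (int i = 0; i < 5; i++) {
                try {
                    sum.addAndGet(queue.take()); // blocks if the queue is empty
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            }
        });

        producer.start();
        consumer.start();
        producer.join();
        consumer.join();
        return sum.get();
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println("sum of exchanged values: " + exchange()); // 0+1+2+3+4 = 10
    }
}
```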

Model

Synchronous

Synchronous I/O: many networking libraries and frameworks rely on a simple threading strategy: each network client is assigned a thread upon connection, and this thread deals with the client until it disconnects.

Too many concurrent connections hurt scalability because system threads are not cheap, and under heavy load an operating system kernel spends significant time just on thread scheduling.

In Java, this is the case:

  • with Servlets,
  • or with networking code written with the java.io and java.net packages.
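A sketch of this thread-per-connection model with the java.net package (class name and echo protocol are illustrative, not a production server):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.net.ServerSocket;
import java.net.Socket;

public class ThreadPerConnectionServer {
    final ServerSocket serverSocket;

    ThreadPerConnectionServer(int port) throws IOException {
        serverSocket = new ServerSocket(port); // port 0 = pick any free port
    }

    // Accept loop: one dedicated thread per client, as in the model above.
    void start() {
        Thread acceptor = new Thread(() -> {
            try {
                while (true) {
                    Socket client = serverSocket.accept();    // blocks until a client connects
                    new Thread(() -> handle(client)).start(); // one thread per client
                }
            } catch (IOException e) {
                // server socket closed: shutting down
            }
        });
        acceptor.setDaemon(true);
        acceptor.start();
    }

    // The per-client thread echoes each line back until the client disconnects.
    void handle(Socket client) {
        try (Socket c = client;
             BufferedReader in = new BufferedReader(new InputStreamReader(c.getInputStream()));
             PrintWriter out = new PrintWriter(c.getOutputStream(), true)) {
            String line;
            while ((line = in.readLine()) != null) {
                out.println("echo: " + line);
            }
        } catch (IOException e) {
            // client went away
        }
    }
}
```

With thousands of clients this design allocates thousands of threads, which is exactly the scalability limit described above; asynchronous designs avoid it by multiplexing many connections over few threads.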

Asynchronous

See Event loop

State

See Process / Thread - State

Process

A thread is the child of a process.

Example: the process/thread relationship as shown in Process Explorer.

Hyperthreading

With 48 physical cores and hyper-threading, you get 96 logical cores and can therefore run at most 96 hardware threads simultaneously.
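The number of logical cores visible to a program can be queried at runtime; in Java, for instance:

```java
public class LogicalCores {
    public static void main(String[] args) {
        // On a 48-physical-core machine with hyper-threading enabled,
        // this typically reports 96 logical cores.
        int logicalCores = Runtime.getRuntime().availableProcessors();
        System.out.println("logical cores: " + logicalCores);
    }
}
```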
