Concurrency vs. Parallelism

What concurrency and parallelism actually mean in Clojure, why they are not the same thing, and how Java developers should choose the right tool.

Concurrency and parallelism are related, but they answer different questions. Java developers often learn both concepts around threads and executors, then carry the habit of using the terms interchangeably. In Clojure, it helps to separate them clearly because the language gives you different tools for coordination and for raw parallel execution.

The short version

  • Concurrency is about managing multiple tasks that may overlap in time.
  • Parallelism is about actually doing work at the same time on multiple cores.

A program can be concurrent without being parallel, and parallel without requiring any complicated coordination.

What concurrency means

Concurrency is mainly a design problem. It asks: how do different activities interact without corrupting shared state or blocking each other unnecessarily?

For example:

  • a web service handles many requests in flight
  • one part of the system updates state while another reads it
  • background jobs and request handlers coexist safely

This is where Clojure’s emphasis on immutable data and explicit state tools matters most.

What parallelism means

Parallelism is mainly an execution problem. It asks: can independent work run simultaneously and finish faster by using more than one CPU core?

For example:

  • process many large values independently
  • run separate calculations at the same time
  • divide CPU-heavy tasks into chunks

This is useful only when the work is independent enough and heavy enough to justify the coordination overhead.
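The last bullet above can be sketched with `pmap` over partitions. This is a minimal sketch, not the only way to chunk work; `heavy-sum`, `parallel-sum-of-squares`, and the chunk size are illustrative names and choices, not anything fixed by the text:

```clojure
;; Sketch: split a CPU-heavy reduction into independent chunks and
;; process the chunks in parallel. Names and chunk size are illustrative.
(defn heavy-sum [xs]
  ;; stand-in for real CPU-bound work on one chunk
  (reduce + (map #(* % %) xs)))

(defn parallel-sum-of-squares [xs chunk-size]
  (->> xs
       (partition-all chunk-size)   ; independent chunks of work
       (pmap heavy-sum)             ; each chunk may run on its own thread
       (reduce +)))                 ; cheap sequential combine step

(parallel-sum-of-squares (range 1000) 100)
;; same result as (reduce + (map #(* % %) (range 1000)))
```

The combine step stays sequential on purpose: it is cheap, and only the per-chunk work is heavy enough to be worth spreading across cores.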

Why the distinction matters in Clojure

Clojure is strong at concurrency because immutable values and explicit state references reduce the usual shared-mutation hazards.

That means you often solve concurrency problems by:

  • sharing immutable values safely
  • isolating state changes behind atoms, refs, or agents
  • designing functions that compute new values instead of mutating old ones
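The second bullet has one case worth sketching: when two pieces of state must change together, refs and transactions coordinate the change. This is a minimal sketch; the account names are illustrative, not from the text:

```clojure
;; Sketch: refs coordinate changes to two pieces of state so that both
;; updates commit together or not at all. Names are illustrative.
(def checking (ref 100))
(def savings  (ref 50))

(defn transfer! [amount from to]
  (dosync                       ; transaction: both alters are atomic together
    (alter from - amount)
    (alter to + amount)))

(transfer! 30 checking savings)
@checking ;; => 70
@savings  ;; => 80
```

A concurrent reader can never observe the money "in flight": it sees the balances either before or after the transaction, never between the two alters.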

Parallelism is a separate question. Once the data flow is safe, you can ask whether some part of the work should also run in parallel.

A practical example

This is a concurrency-oriented pattern:

(def request-count (atom 0))

(defn record-request! []
  (swap! request-count inc))

The point here is coordinated state change. You are not trying to make the increment “parallel” in any interesting sense. You are trying to keep shared state correct.
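To see that coordination in action, here is a minimal sketch that hits the same atom from several threads at once; the thread and iteration counts are arbitrary:

```clojure
;; Sketch: many threads increment one atom concurrently. swap! applies
;; inc atomically (retrying on conflict), so no increment is lost.
(def request-count (atom 0))

(defn record-request! []
  (swap! request-count inc))

(let [workers (doall (for [_ (range 10)]
                       (future (dotimes [_ 100] (record-request!)))))]
  (run! deref workers))   ; block until every thread finishes

@request-count ;; => 1000
```

A plain mutable counter incremented this way from ten threads could drop updates; the atom makes the overlap safe without any explicit locking.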

This is a parallelism-oriented pattern:

(doall (pmap #(* % %) (range 1 20)))

Here the point is independent computation across many values. There is no shared state problem to solve first; the question is whether multiple cores can help.
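Because `pmap` keeps `map`'s semantics (same results, same order), a reasonable rule of thumb is: make it correct with `map` first, then swap in `pmap` only when each element's work is heavy. A sketch, with `Thread/sleep` standing in for genuinely expensive per-element work:

```clojure
;; Sketch: pmap returns the same results as map, in the same order.
;; Thread/sleep stands in for real CPU- or IO-heavy work per element.
(defn slow-square [n]
  (Thread/sleep 50)   ; simulated expensive step
  (* n n))

;; Same answer either way; pmap just overlaps the waits across threads.
(= (doall (map  slow-square (range 1 8)))
   (doall (pmap slow-square (range 1 8))))
;; => true
```

The `doall` calls matter: both `map` and `pmap` are lazy, so forcing the sequences is what actually runs the work.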

Java comparison that helps

In Java, concurrency and parallelism often get discussed together because both show up in thread APIs, executors, locks, futures, and streams.

In Clojure, a better split is:

  • concurrency: safe coordination of changing program activity
  • parallelism: simultaneous execution of independent work

That split usually leads to better design decisions.

Common mistakes

Confusing the two problems

If the real issue is shared state or task coordination, you have a concurrency problem first, not a speed problem.

Adding parallelism before fixing data flow

If your state model is messy, adding parallel execution usually makes the problem worse rather than faster.

Using parallel tools for tiny tasks

Parallelism has overhead. Small or cheap operations often run slower when parallelized.
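One honest way to check whether parallelism pays for itself is to time both versions on your own data; Clojure's `time` macro prints elapsed milliseconds and returns the body's value. A sketch (the actual numbers depend entirely on your machine and core count, which is why none are shown here):

```clojure
;; Sketch: compare map and pmap on a deliberately cheap operation.
;; For work this small, pmap's thread coordination usually costs more
;; than it saves. `time` prints elapsed wall-clock time to *out*.
(def xs (vec (range 100000)))

(defn cheap [n] (inc n))

(def sequential (time (doall (map  cheap xs))))
(def parallel   (time (doall (pmap cheap xs))))

;; The results are identical; only the cost differs.
(= sequential parallel) ;; => true
```

If the printed times show the parallel version losing, that is not a failure of `pmap`; it is a sign the per-element work is too cheap to amortize the overhead.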

A practical rule

Ask these questions in order:

  1. do I need safe coordination between activities?
  2. once that is solved, is there enough independent CPU-heavy work to justify parallel execution?

That order keeps the design grounded.

Knowledge Check

### What is the main difference between concurrency and parallelism?

- [x] Concurrency is about coordinating overlapping tasks; parallelism is about executing work simultaneously
- [ ] Concurrency always uses many CPU cores; parallelism never does
- [ ] Concurrency only exists in Java; parallelism only exists in Clojure
- [ ] They are two words for the same thing

> **Explanation:** Concurrency focuses on structure and coordination. Parallelism focuses on doing work at the same time.

### Which example is primarily a concurrency concern rather than a parallelism concern?

- [x] Safely updating shared application state with an `atom`
- [ ] Running independent numeric calculations with `pmap`
- [ ] Splitting a CPU-heavy workload across cores
- [ ] Batch-processing independent files at the same time

> **Explanation:** Coordinating shared state is a concurrency problem. The `atom` example is about correctness under overlap, not speedup from multiple cores.

### Why should a Java developer usually solve concurrency design before adding parallelism?

- [x] Because unsafe state flow becomes harder to reason about once work also runs in parallel
- [ ] Because Clojure forbids parallel execution until all atoms are removed
- [ ] Because parallel code never improves performance
- [ ] Because immutable data prevents any simultaneous work

> **Explanation:** If the coordination model is unclear, parallel execution tends to amplify bugs rather than help performance.
Revised on Friday, April 24, 2026