Project Loom: Understand The New Java Concurrency Model

June 17, 2022
Project Loom



Overview

Project Loom introduces a lightweight concurrency construct for Java. Early prototypes have already been made available in the form of Java libraries. The project is currently in the final stages of development and is planned to be released as a preview feature with JDK 19. Project Loom is shaping up to be a game-changing addition to Java. This new lightweight concurrency model supports high throughput and aims to make it easier for Java developers to write, debug, and maintain concurrent applications.

In this article, we will look at Project Loom and how this new concurrency model works. We will discuss its most prominent parts: virtual threads, the scheduler, the Fiber class, and continuations.

What Are Virtual Threads in Java?

A thread supports the concurrent execution of instructions in modern high-level programming languages and operating systems. Each thread has its own flow of execution, and multiple threads are used to execute different parts of a task simultaneously. Usually, it is the operating system’s job to schedule and manage threads across the available CPU resources.

In contrast, virtual threads, also known as user threads or green threads, are scheduled by the application instead of the operating system. In Java, the JVM takes the role of that application and has full control over all virtual threads and the entire scheduling process. Virtual threads play an important role in serving concurrent requests from users and other applications.

Concurrency vs. Parallelism

Developers often confuse these two concepts, so it is important to understand the difference before proceeding. Concurrency is the process of scheduling multiple, largely independent tasks onto a smaller or limited number of resources, whereas parallelism is the process of performing a single task faster by using more resources, such as multiple processing units. In parallelism, the job is broken down into smaller subtasks that are executed simultaneously to complete it more quickly. To summarize, parallelism is about cooperating on a single task, whereas concurrency is about different tasks competing for the same resources. In Java, parallelism is typically achieved with parallel streams, and Project Loom is the answer to the concurrency side of the problem.
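
As a quick illustration, the minimal sketch below contrasts the two ideas using only standard Java APIs: a parallel stream splits one computation across CPU cores, while an ExecutorService juggles many independent tasks on a small pool of threads. The pool size, counts, and task bodies are placeholders.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.stream.LongStream;

public class ConcurrencyVsParallelism {
    public static void main(String[] args) {
        // Parallelism: one task (summing a range) split across CPU cores.
        long sum = LongStream.rangeClosed(1, 1_000_000).parallel().sum();
        System.out.println("Parallel sum = " + sum);

        // Concurrency: many independent tasks competing for a small pool of threads.
        ExecutorService pool = Executors.newFixedThreadPool(4);
        for (int i = 0; i < 20; i++) {
            int taskId = i;
            pool.submit(() -> System.out.println("Handling request " + taskId
                    + " on " + Thread.currentThread().getName()));
        }
        pool.shutdown();
    }
}
```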

Concurrency Model of Java

The whole concurrency model in Java is based on threads, which are the primary building block of concurrent Java applications. However, traditional threads are no longer enough to fulfill modern concurrency requirements, for several reasons:

  1. These threads cannot handle the level of concurrency required by applications developed nowadays. For instance, an application may need to run up to millions of tasks concurrently, which is nowhere near the number of threads the operating system can handle (a baseline sketch follows this list).
  2. Traditional threads are also blocked by I/O. A thread is suspended whenever the data it needs to read or write is not yet available, which limits how much useful work a thread can do and how many threads an application can utilize. As a result, the number of concurrent connections is limited as well, and throughput cannot be scaled simply by adding threads.
  3. Most concurrent applications developed in Java require some level of synchronization between threads for every request to work properly, because so many threads are working concurrently. The resulting context switching between kernel threads is expensive and hurts the performance of the application.
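
To make the first limitation concrete, here is a hedged baseline sketch of the classic thread-per-task approach with platform threads. Each task occupies an operating-system thread for its entire duration, so raising TASK_COUNT toward the millions would exhaust memory or hit OS limits long before the tasks themselves become the bottleneck; the task body and counts are illustrative only.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class ThreadPerTaskBaseline {
    // Illustrative figure: pushing this toward millions is not feasible
    // when every task needs its own OS thread.
    private static final int TASK_COUNT = 10_000;

    public static void main(String[] args) {
        // Each submitted task pins an OS (platform) thread while it sleeps,
        // simulating a blocking call such as a database query.
        ExecutorService executor = Executors.newCachedThreadPool();
        for (int i = 0; i < TASK_COUNT; i++) {
            executor.submit(() -> {
                try {
                    Thread.sleep(1_000); // placeholder for blocking I/O
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            });
        }
        executor.shutdown();
    }
}
```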

To cater to these issues, asynchronous, non-blocking I/O was introduced. Asynchronous I/O allows a single thread to handle multiple concurrent connections, but it requires rather complex code. Much of this complexity is hidden from the user to make the code look simpler, yet a different mindset is still required: hiding the complexity is not a permanent solution, and it also restricts how much developers can adapt the code to their needs.

Another possible solution is the use of asynchronous concurrent APIs; CompletableFuture and RxJava are two commonly used examples. These APIs do not block the thread when there is a delay. Instead, they give the application a concurrency construct on top of Java threads to manage its work. The downside of this solution is that these APIs are complex, and integrating them with legacy, blocking APIs is also a fairly involved process.
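
For example, with CompletableFuture the delay is handled by chaining callbacks rather than blocking the calling thread. The sketch below uses only the standard CompletableFuture API; fetchUser and its return value are made-up placeholders for a slow remote call.

```java
import java.util.concurrent.CompletableFuture;

public class AsyncApiExample {
    public static void main(String[] args) {
        CompletableFuture<String> result = CompletableFuture
                // Hypothetical remote call; runs on a pool thread, so the caller is not blocked.
                .supplyAsync(AsyncApiExample::fetchUser)
                .thenApply(user -> user.toUpperCase())
                .exceptionally(ex -> "fallback-user");

        // join() is used here only to keep the demo alive until the pipeline completes.
        System.out.println(result.join());
    }

    private static String fetchUser() {
        // Placeholder for a slow I/O operation such as an HTTP request.
        return "jordan";
    }
}
```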

What is needed, then, is a lightweight concurrency construct that does not rely on these traditional threads managed by the operating system.

Java Project Loom

Project Loom offers a well-suited solution for such situations. It proposes that developers be allowed to use virtual threads with traditional blocking I/O. If a virtual thread is blocked by an I/O operation, the underlying operating-system thread is not blocked, because virtual threads are managed by the application (the JVM) rather than the operating system. This largely eliminates the scalability issues caused by blocking I/O.
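
A hedged sketch of what this looks like in practice (it assumes a JDK 19 early-access build run with --enable-preview): each task performs a blocking sleep, standing in for blocking I/O, yet tens of thousands of them can run because only the virtual thread is parked, not the underlying OS thread.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class BlockingIoOnVirtualThreads {
    public static void main(String[] args) {
        // One virtual thread per task: blocking inside a task parks only the
        // virtual thread; the carrier (OS) thread is freed to run other tasks.
        try (ExecutorService executor = Executors.newVirtualThreadPerTaskExecutor()) {
            for (int i = 0; i < 100_000; i++) {
                executor.submit(() -> {
                    Thread.sleep(1_000); // stands in for a blocking I/O call
                    return null;
                });
            }
        } // close() waits for the submitted tasks to finish
    }
}
```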

The best part of virtual threads is that using them is nearly the same as using traditional threads. On top of that, Project Loom also introduces several new API methods, described below.

New methods in Thread Class

A few new methods have been introduced in the Java Thread class. For instance, the Thread.ofVirtual() method returns a builder that can start a virtual thread or create a ThreadFactory. Similarly, the Executors.newVirtualThreadPerTaskExecutor() method has been added, which creates an ExecutorService that starts a new virtual thread for each task. You can use these features by adding the --enable-preview JVM argument during compilation and execution, as with any other preview feature.
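
A short sketch of the new builder API (again assuming a JDK 19 preview build with --enable-preview); the thread names are arbitrary:

```java
import java.util.concurrent.ThreadFactory;

public class VirtualThreadApiExample {
    public static void main(String[] args) throws InterruptedException {
        // Start a named virtual thread directly from the builder.
        Thread vt = Thread.ofVirtual()
                .name("worker-1")
                .start(() -> System.out.println("Running in " + Thread.currentThread()));
        vt.join();

        // Or obtain a ThreadFactory that produces virtual threads,
        // e.g. to plug into existing executor-based code.
        ThreadFactory factory = Thread.ofVirtual().name("worker-", 2).factory();
        Thread another = factory.newThread(() -> System.out.println("Factory-made virtual thread"));
        another.start();
        another.join();
    }
}
```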

Introduction of Fiber Class

A new class named Fiber has also been introduced alongside the updates to the Thread class. Although both classes offer quite similar functionality, there are two main differences that make the Fiber class more useful in some cases:

a. Pluggable user-mode scheduler

Project Loom allows pluggable schedulers to be used with the Fiber class, with a ForkJoinPool in asynchronous mode as the default scheduler. ForkJoinPool is based on the work-stealing algorithm: every worker thread maintains a double-ended queue (deque) of tasks and executes tasks from its own deque. An idle thread does not block while waiting for work; instead, it steals a task from the tail of another thread's deque.

The difference in asynchronous mode is that each worker processes its own local queue in FIFO order, which suits tasks that are submitted as events rather than joined. When a running task schedules another task, ForkJoinPool adds it to the local queue of the same worker, so it tends to be executed on the same CPU.
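
As a rough illustration of the work-stealing scheduler itself (standard ForkJoinPool API, independent of Loom), the sketch below constructs a pool in async (FIFO) mode and splits a sum into subtasks that idle workers can steal; the range and threshold are arbitrary:

```java
import java.util.concurrent.ForkJoinPool;
import java.util.concurrent.RecursiveTask;

public class WorkStealingSketch {
    // Splits a range sum into subtasks that idle workers can steal.
    static class SumTask extends RecursiveTask<Long> {
        private final long from, to;
        SumTask(long from, long to) { this.from = from; this.to = to; }

        @Override
        protected Long compute() {
            if (to - from <= 1_000) {
                long sum = 0;
                for (long i = from; i < to; i++) sum += i;
                return sum;
            }
            long mid = (from + to) / 2;
            SumTask left = new SumTask(from, mid);
            SumTask right = new SumTask(mid, to);
            left.fork();                          // pushed onto this worker's deque
            return right.compute() + left.join(); // an idle worker may steal 'left'
        }
    }

    public static void main(String[] args) {
        // asyncMode = true: workers process their local queues in FIFO order.
        ForkJoinPool pool = new ForkJoinPool(
                Runtime.getRuntime().availableProcessors(),
                ForkJoinPool.defaultForkJoinWorkerThreadFactory,
                null,
                true);
        System.out.println(pool.invoke(new SumTask(0, 1_000_000)));
        pool.shutdown();
    }
}
```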

b. Internal user-mode continuation 

The Fiber class wraps each task in an internal user-mode continuation. This means the task is suspended and resumed inside the Java runtime instead of the operating system kernel. A continuation is the actual task to be performed; it consists of a sequence of instructions to be executed. Every continuation has an entry point and a yield (suspension) point. Whenever the caller resumes the continuation after it has been suspended, control returns to the exact point where it was suspended.

A point to be noted is that this suspension and resumption happens in the application runtime instead of the OS. As a result, it avoids the expensive context switch between kernel threads.
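
Conceptually, this can be seen with the internal continuation primitive that JDK 19 early-access builds use under the hood. The sketch below relies on jdk.internal.vm.Continuation, an unsupported internal API that requires --add-exports java.base/jdk.internal.vm=ALL-UNNAMED to compile and run; treat it purely as an illustration of entry and yield points, not as something to use in application code.

```java
// Compile/run with: --add-exports java.base/jdk.internal.vm=ALL-UNNAMED (JDK 19 EA, internal API)
import jdk.internal.vm.Continuation;
import jdk.internal.vm.ContinuationScope;

public class ContinuationSketch {
    public static void main(String[] args) {
        ContinuationScope scope = new ContinuationScope("demo");
        Continuation cont = new Continuation(scope, () -> {
            System.out.println("entry point");
            Continuation.yield(scope);          // suspension point: control returns to the caller
            System.out.println("resumed exactly after the yield");
        });

        cont.run();  // runs until the yield
        System.out.println("back in the caller while the continuation is suspended");
        cont.run();  // resumes at the instruction after the yield
    }
}
```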

Because suspending a continuation also requires its call stack to be stored so that execution can resume at the same point, suspension can become a costly process. To address that, Project Loom also aims to make storing and retrieving these stacks lightweight when continuations are suspended and resumed.

Future of Project Loom

Virtual threads, the primary part of Project Loom, are currently targeted to be included in JDK 19 as a preview feature. If they receive the expected response, their preview status is expected to be removed by the release of JDK 21.

The good news for early adopters and Java enthusiasts is that the virtual thread APIs are already included in the latest early-access builds of JDK 19, so you can try them out right now. The purpose of this preview is to gather constructive feedback from Java developers so that the JDK team can adapt and improve the implementation in future versions.

Conclusion

This article discussed the problems with Java’s current concurrency model and how Project Loom aims to solve them. We also explored tasks and schedulers in threads and how the Fiber class, continuations, and pluggable user-mode schedulers can be an excellent alternative to traditional threads in Java.

Also Read: The New Java Roadmap: What’s In Store For The Future?



Author: Jordan

Full Stack Java Developer | Writer | Recruiter, bridging the gap between exceptional talent and opportunities for some of the biggest Fortune 500 companies.

