Day 4 of 5
⏱ ~60 minutes
Scala in 5 Days — Day 4

Futures, Akka & Concurrency

Scala's Future provides asynchronous programming. Akka (forked as the open-source Apache Pekko after Akka's 2022 license change) provides actor-based concurrency at massive scale. Today covers both — and how to decide which model to use.

Futures and Promises

Future[A] represents an asynchronous computation that will eventually produce an A (or fail). Create with Future { heavyComputation() } (needs an implicit ExecutionContext). Chain with map, flatMap, recover, onComplete. Await.result(future, timeout) blocks for a result (use only in tests/REPL). Promise[A] is the producer side — complete it with success or failure from another thread. Futures compose with for-comprehensions.
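The producer side can be sketched like this — a minimal example, where the background `Thread` stands in for whatever worker eventually produces the value (`Await` is used here only to demonstrate the result):

```scala
import scala.concurrent._
import scala.concurrent.duration._

// Promise[A] is the write side; promise.future is the read side.
val promise = Promise[Int]()
val future: Future[Int] = promise.future

// Complete the promise from another thread (simulated worker).
new Thread(() => {
  Thread.sleep(100)       // simulate slow work
  promise.success(42)     // fulfils the Future for every consumer
}).start()

println(Await.result(future, 1.second))  // demo/REPL only; prints 42
```

A Promise can be completed exactly once; a second `success` or `failure` throws `IllegalStateException` (use `trySuccess`/`tryFailure` when completion may race).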

Akka Actors

Actors are lightweight concurrent objects that process messages from a mailbox sequentially. No shared state — actors communicate only by message. ActorSystem is the container. Props creates actor configuration. actorRef ! message sends a message (fire-and-forget). ask (?) sends a message and returns a Future. Actors handle failures with supervision strategies. Akka Typed (Akka 2.6+) adds compile-time message type safety.
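A minimal Akka Typed actor might look like the sketch below — it assumes the `akka-actor-typed` dependency is on the classpath, and `Greeter`/`Greet` are illustrative names, not part of Akka itself:

```scala
import akka.actor.typed.{ActorSystem, Behavior}
import akka.actor.typed.scaladsl.Behaviors

// The message protocol: the actor only accepts Command subtypes,
// checked at compile time.
sealed trait Command
final case class Greet(name: String) extends Command

object Greeter {
  // A Behavior describes how the actor reacts to each message.
  def apply(): Behavior[Command] = Behaviors.receiveMessage {
    case Greet(name) =>
      println(s"Hello, $name!")
      Behaviors.same // keep the same behavior for the next message
  }
}

val system = ActorSystem(Greeter(), "greeter") // the ActorSystem is itself an ActorRef
system ! Greet("Scala")                        // fire-and-forget send
```

Sending a message of the wrong type to `system` is a compile error — this is the type safety Akka Typed adds over classic untyped actors.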

Reactive Streams with Akka Streams

Akka Streams implements the Reactive Streams specification for backpressure-aware stream processing. Source (0+ elements), Flow (transform), Sink (consume). Elements flow downstream; demand signals flow upstream, preventing fast producers from overwhelming slow consumers. Akka Streams is the engine behind Alpakka (connectors for Kafka, S3, databases) and the foundation of Akka HTTP.
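Wiring a Source, Flow, and Sink together looks roughly like this — a sketch assuming the `akka-stream` and `akka-actor-typed` dependencies:

```scala
import akka.actor.typed.ActorSystem
import akka.actor.typed.scaladsl.Behaviors
import akka.stream.scaladsl.{Flow, Sink, Source}

// Streams need a materializer, provided implicitly by the ActorSystem.
implicit val system: ActorSystem[Nothing] = ActorSystem(Behaviors.empty, "streams")

val numbers = Source(1 to 100)            // Source: emits 0+ elements
val doubler = Flow[Int].map(_ * 2)        // Flow: transforms elements
val printer = Sink.foreach[Int](println)  // Sink: consumes elements

// Nothing runs until the graph is materialized with run/runWith.
numbers.via(doubler).runWith(printer)
```

Backpressure is automatic: if `printer` were slow, demand signals would throttle `numbers` without any buffering code on your part.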

```scala
import scala.concurrent._
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration._

// Futures: parallel computation
def fetchUser(id: Int): Future[String] =
  Future { Thread.sleep(100); s"User$id" }

def fetchOrders(userId: String): Future[List[String]] =
  Future { Thread.sleep(50); List(s"Order1 for $userId", s"Order2 for $userId") }

// Sequential via for-comprehension (fetchOrders starts only after fetchUser completes)
val result: Future[List[String]] = for {
  user   <- fetchUser(42)
  orders <- fetchOrders(user)
} yield orders

// Parallel with zip
val parallel = fetchUser(1).zip(fetchUser(2))  // both run concurrently

// Recover from failure
val safe: Future[String] = fetchUser(-1)
  .map(u => s"Found: $u")
  .recover { case e: Exception => s"Error: ${e.getMessage}" }

// Block for result (test/REPL only)
println(Await.result(result, 5.seconds))
```
💡
Never call Await.result in production application code — it blocks a thread. Instead, use onComplete, map, flatMap, and recover to handle future results asynchronously. Only use Await in tests to synchronize assertions.
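The non-blocking alternative registers a callback instead — `onComplete` runs on the ExecutionContext when the Future finishes, and the calling thread never waits:

```scala
import scala.concurrent._
import scala.concurrent.ExecutionContext.Implicits.global
import scala.util.{Failure, Success}

val computation: Future[Int] = Future { 21 * 2 }

// The callback receives a Try[Int]: Success or Failure.
computation.onComplete {
  case Success(value) => println(s"Got $value")
  case Failure(error) => println(s"Failed: ${error.getMessage}")
}
```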
📝 Day 4 Exercise
Parallel Data Fetching with Futures
  1. Write a function that fetches 5 'database records' in parallel using Future.sequence
  2. Chain the results: fetch users, then for each user fetch their orders
  3. Compare parallel vs sequential execution time with System.currentTimeMillis()
  4. Add error recovery: if any fetch fails, return a default value instead of failing the whole Future
  5. Write a test using Await.result with a 10-second timeout to verify results

Day 4 Summary

  • Future[A] represents an async computation; chain with map/flatMap/recover
  • ExecutionContext provides the thread pool that executes Futures
  • Akka Actors communicate via message passing with sequential mailbox processing
  • Akka Streams adds backpressure to prevent slow consumers from being overwhelmed
  • Await.result blocks — use only in tests, never in production code
Challenge

Build a parallel web scraper using Futures: fetch 10 URLs concurrently, extract the page title from each, handle failures gracefully, and return a Map[URL, Option[String]] where None indicates a failed fetch.
