diff --git a/book/0_background_information.html b/book/0_background_information.html
index 25b2f66..225063f 100644
--- a/book/0_background_information.html
+++ b/book/0_background_information.html
@@ -153,8 +153,8 @@

Before we go into the details about Futures in Rust, let's take a quick look at the alternatives for handling concurrent programming in general and some pros and cons for each of them.

-

While we do that we'll get some information on concurrency which will make it -easier for us when we dive in to Futures specifically.

+

While we do that we'll also explain some aspects of concurrency which will make it easier for us when we dive into Futures specifically.

For fun, I've added a small snippet of runnable code with most of the examples. If you're like me, things get way more interesting then, and maybe you'll see some

@@ -210,7 +210,7 @@ fn main() {

OS threads sure have some pretty big advantages. So why all this talk about "async" and concurrency in the first place?

-

First of all. For computers to be efficient it needs to multitask. Once you +

First of all, for computers to be efficient they need to multitask. Once you start to look under the covers (like how an operating system works) you'll see concurrency everywhere. It's fundamental in everything we do.

Secondly, we have the web.

@@ -218,10 +218,9 @@ you'll see concurrency everywhere. It's very fundamental in everything we do.

-

This gets even more relevant when the load is variable -which means the current number of tasks a program has at any point in time is -unpredictable. That's why you'll see so many async web frameworks and database -drivers today.

+

This gets even more problematic when the load is variable, which means the number of tasks a program has at any point in time is unpredictable. That's why you'll see so many async web frameworks and database drivers today.

However, for a huge number of problems, the standard OS threads will often be the right solution. So, just think twice about your problem before you reach for an async library.
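The book's own thread example is elided from this diff, but as a minimal sketch of what reaching for plain OS threads looks like in Rust (the doubling workload here is made up purely for illustration):

```rust
use std::thread;

fn main() {
    // Each call to `thread::spawn` creates a real OS thread with its own
    // stack; the OS scheduler decides when each one runs.
    let handles: Vec<_> = (0..4)
        .map(|i| thread::spawn(move || i * 2))
        .collect();

    // `join` blocks until the child thread finishes and hands back the
    // closure's return value.
    for handle in handles {
        println!("thread returned: {}", handle.join().unwrap());
    }
}
```

Since we `join` the handles in spawn order, the results come back in order even though the threads themselves may run in any order.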

@@ -237,15 +236,16 @@ such a system) which then continues running a different task.

Rust had green threads once, but they were removed before it hit 1.0. The state of execution is stored in each task's stack, so in such a solution there would be no need for async, await, Futures or Pin.

-

The typical flow will be like this:

+

The typical flow looks like this:

-
1. Run som non-blocking code
+
1. Run some non-blocking code

2. Make a blocking call to some external resource
3. CPU jumps to the "main" thread which schedules a different thread to run and "jumps" to that stack
4. Run some non-blocking code on the new thread until a new blocking call or the task is finished

-
5. "jumps" back to the "main" thread, schedule a new thread to run and jump to that
+
5. "jumps" back to the "main" thread, schedules a new thread which is ready to make progress and jumps to that.

These "jumps" are know as context switches. Your OS is doing it many times each second as you read this.

@@ -501,16 +501,17 @@ in life, close your eyes now and scroll down for 2-3 seconds. You'll find a link there that takes you to safety.

-

The whole idea behind a callback based approach is to save a pointer to a set of instructions we want to run later. We can save that pointer on the stack before we yield control to the runtime, or in some sort of collection as we do below.

+

The whole idea behind a callback based approach is to save a pointer to a set of instructions we want to run later, together with whatever state is needed. In Rust this would be a closure. In the example below, we save this information in a HashMap, but it's not the only option.

The basic idea of not involving threads as a primary way to achieve concurrency is the common denominator for the rest of the approaches, including the one Rust uses today, which we'll soon get to.
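The book's full example isn't part of this hunk, but a minimal sketch of the idea — closures capturing their own state, stored in a `HashMap` keyed by made-up event ids — could look like this:

```rust
use std::collections::HashMap;

fn main() {
    // The callbacks are boxed closures; each captures whatever state it
    // needs. The keys are hypothetical event ids, chosen for illustration.
    let mut callbacks: HashMap<usize, Box<dyn FnOnce() -> String>> = HashMap::new();

    let data = String::from("hello");
    // `data` is moved into the closure, so the state travels with it.
    callbacks.insert(1, Box::new(move || format!("callback 1 got: {data}")));
    callbacks.insert(2, Box::new(|| String::from("callback 2 ran")));

    // "Later", when event 1 fires, we take the callback out and run it.
    if let Some(cb) = callbacks.remove(&1) {
        println!("{}", cb());
    }
}
```

Using `remove` rather than `get` matters here: an `FnOnce` closure consumes its captured state, so it must be moved out of the map before it can be called.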

Advantages:

Drawbacks: