diff --git a/src/0_background_information.md b/src/0_background_information.md index 5fcc439..22ade35 100644 --- a/src/0_background_information.md +++ b/src/0_background_information.md @@ -72,12 +72,12 @@ First of all. For computers to be [_efficient_](https://en.wikipedia.org/wiki/Ef start to look under the covers (like [how an operating system works](https://os.phil-opp.com/async-await/)) you'll see concurrency everywhere. It's very fundamental in everything we do. -Secondly, we have the web. +Secondly, we have the web. Webservers is all about I/O and handling small tasks (requests). When the number of small tasks is large it's not a good fit for OS threads as of today because of the memory they require and the overhead involved -when creating new threads. +when creating new threads. This gets even more problematic when the load is variable which means the current number of tasks a program has at any point in time is unpredictable. That's why you'll see so many async web @@ -102,7 +102,7 @@ such a system) which then continues running a different task. Rust had green threads once, but they were removed before it hit 1.0. The state of execution is stored in each stack so in such a solution there would be no -need for `async`, `await`, `Futures` or `Pin`. +need for `async`, `await`, `Future`s or `Pin`. **The typical flow looks like this:** @@ -145,27 +145,27 @@ A green threads example could look something like this: > It's not in any way meant to showcase "best practice". Just so we're on > the same page. -_**Press the expand icon in the top right corner to show the example code.**_ +_**Press the expand icon in the top right corner to show the example code.**_ ```rust, edition2018 # #![feature(asm, naked_functions)] # use std::ptr; -# +# # const DEFAULT_STACK_SIZE: usize = 1024 * 1024 * 2; # const MAX_THREADS: usize = 4; # static mut RUNTIME: usize = 0; -# +# # pub struct Runtime { # threads: Vec, # current: usize, # } -# +# # #[derive(PartialEq, Eq, Debug)] # enum State { # Available, # Running, # Ready, # } -# +# # struct Thread { # id: usize, # stack: Vec, @@ -173,7 +173,7 @@ _**Press the expand icon in the top right corner to show the example code.**_ # state: State, # task: Option>, # } -# +# # #[derive(Debug, Default)] # #[repr(C)] # struct ThreadContext { @@ -186,7 +186,7 @@ _**Press the expand icon in the top right corner to show the example code.**_ # rbp: u64, # thread_ptr: u64, # } -# +# # impl Thread { # fn new(id: usize) -> Self { # Thread { @@ -198,7 +198,7 @@ _**Press the expand icon in the top right corner to show the example code.**_ # } # } # } -# +# # impl Runtime { # pub fn new() -> Self { # let base_thread = Thread { @@ -208,37 +208,37 @@ _**Press the expand icon in the top right corner to show the example code.**_ # state: State::Running, # task: None, # }; -# +# # let mut threads = vec![base_thread]; # threads[0].ctx.thread_ptr = &threads[0] as *const Thread as u64; # let mut available_threads: Vec = (1..MAX_THREADS).map(|i| Thread::new(i)).collect(); # threads.append(&mut available_threads); -# +# # Runtime { # threads, # current: 0, # } # } -# +# # pub fn init(&self) { # unsafe { # let r_ptr: *const Runtime = self; # RUNTIME = r_ptr as usize; # } # } -# +# # pub fn run(&mut self) -> ! 
{ # while self.t_yield() {} # std::process::exit(0); # } -# +# # fn t_return(&mut self) { # if self.current != 0 { # self.threads[self.current].state = State::Available; # self.t_yield(); # } # } -# +# # fn t_yield(&mut self) -> bool { # let mut pos = self.current; # while self.threads[pos].state != State::Ready { @@ -250,21 +250,21 @@ _**Press the expand icon in the top right corner to show the example code.**_ # return false; # } # } -# +# # if self.threads[self.current].state != State::Available { # self.threads[self.current].state = State::Ready; # } -# +# # self.threads[pos].state = State::Running; # let old_pos = self.current; # self.current = pos; -# +# # unsafe { # switch(&mut self.threads[old_pos].ctx, &self.threads[pos].ctx); # } # true # } -# +# # pub fn spawn(f: F){ # unsafe { # let rt_ptr = RUNTIME as *mut Runtime; @@ -273,7 +273,7 @@ _**Press the expand icon in the top right corner to show the example code.**_ # .iter_mut() # .find(|t| t.state == State::Available) # .expect("no available thread."); -# +# # let size = available.stack.len(); # let s_ptr = available.stack.as_mut_ptr(); # available.task = Some(Box::new(f)); @@ -285,14 +285,14 @@ _**Press the expand icon in the top right corner to show the example code.**_ # } # } # } -# +# # fn call(thread: u64) { # let thread = unsafe { &*(thread as *const Thread) }; # if let Some(f) = &thread.task { # f(); # } # } -# +# # #[naked] # fn guard() { # unsafe { @@ -302,14 +302,14 @@ _**Press the expand icon in the top right corner to show the example code.**_ # rt.t_return(); # }; # } -# +# # pub fn yield_thread() { # unsafe { # let rt_ptr = RUNTIME as *mut Runtime; # (*rt_ptr).t_yield(); # }; # } -# +# # #[naked] # #[inline(never)] # unsafe fn switch(old: *mut ThreadContext, new: *const ThreadContext) { @@ -321,7 +321,7 @@ _**Press the expand icon in the top right corner to show the example code.**_ # mov %r12, 0x20($0) # mov %rbx, 0x28($0) # mov %rbp, 0x30($0) -# +# # mov 0x00($1), %rsp # mov 0x08($1), %r15 # mov 0x10($1), %r14 @@ -366,7 +366,7 @@ the same. You can always go back and read the book which explains it later. ## Callback based approaches You probably already know what we're going to talk about in the next paragraphs -from Javascript which I assume most know. +from Javascript which I assume most know. >If your exposure to Javascript callbacks has given you any sorts of PTSD earlier in life, close your eyes now and scroll down for 2-3 seconds. You'll find a link @@ -482,11 +482,11 @@ as timers but could represent any kind of resource that we'll have to wait for. You might start to wonder by now, when are we going to talk about Futures? -Well, we're getting there. You see `promises`, `futures` and other names for -deferred computations are often used interchangeably. +Well, we're getting there. You see `Promise`s, `Future`s and other names for +deferred computations are often used interchangeably. There are formal differences between them but we'll not cover that here but it's -worth explaining `promises` a bit since they're widely known due to being used +worth explaining `Promise`s a bit since they're widely known due to being used in Javascript and have a lot in common with Rusts Futures. First of all, many languages has a concept of promises but I'll use the ones @@ -521,8 +521,8 @@ timer(200) ``` The change is even more substantial under the hood. You see, promises return -a state machine which can be in one of three states: `pending`, `fulfilled` or -`rejected`. 
+a state machine which can be in one of three states: `pending`, `fulfilled` or +`rejected`. When we call `timer(200)` in the sample above, we get back a promise in the state `pending`. @@ -540,7 +540,7 @@ async function run() { You can consider the `run` function a _pausable_ task consisting of several sub-tasks. On each "await" point it yields control to the scheduler (in this -case it's the well known Javascript event loop). +case it's the well known Javascript event loop). Once one of the sub-tasks changes state to either `fulfilled` or `rejected` the task is scheduled to continue to the next step. @@ -548,7 +548,7 @@ task is scheduled to continue to the next step. Syntactically, Rusts Futures 0.1 was a lot like the promises example above and Rusts Futures 0.3 is a lot like async/await in our last example. -Now this is also where the similarities with Rusts Futures stop. The reason we +Now this is also where the similarities with Rusts Futures stop. The reason we go through all this is to get an introduction and get into the right mindset for exploring Rusts Futures. @@ -558,6 +558,6 @@ exploring Rusts Futures. > need to be polled once before they do any work.
-
+ \ No newline at end of file +
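
The note above says that Rust's futures are lazy and do nothing until they are polled, in contrast to JavaScript promises, which start running as soon as they are created. Below is a minimal sketch of that difference. It is not part of the patch above, and it assumes the `futures` crate is available just to borrow its simple `block_on` executor.

```rust
use futures::executor::block_on;

fn main() {
    // Creating the future runs none of its body yet.
    let fut = async {
        println!("the future is now being polled");
        1 + 1
    };
    println!("future created, nothing has run so far");

    // Only when an executor polls the future to completion does the body execute.
    let answer = block_on(fut);
    println!("got {}", answer);
}
```

Running it prints the "future created" line first, which is exactly the laziness the note describes.
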
diff --git a/src/1_futures_in_rust.md b/src/1_futures_in_rust.md index 0acfa5f..826c7ee 100644 --- a/src/1_futures_in_rust.md +++ b/src/1_futures_in_rust.md @@ -91,7 +91,7 @@ Rust is different from these languages in the sense that Rust doesn't come with a runtime for handling concurrency, so you need to use a library which provide this for you. -Quite a bit of complexity attributed to `Futures` is actually complexity rooted +Quite a bit of complexity attributed to `Future`s is actually complexity rooted in runtimes. Creating an efficient runtime is hard. Learning how to use one correctly requires quite a bit of effort as well, but @@ -114,7 +114,7 @@ on the `Future`. You can think of the former as the reactor's job, and the latter as the executors job. These two parts of a runtime interacts using the `Waker` type. -The two most popular runtimes for `Futures` as of writing this is: +The two most popular runtimes for `Future`s as of writing this is: - [async-std](https://github.com/async-rs/async-std) - [Tokio](https://github.com/tokio-rs/tokio) @@ -138,17 +138,17 @@ take a look at this async block using pseudo-rust as example: ```rust, ignore let non_leaf = async { let mut stream = TcpStream::connect("127.0.0.1:3000").await.unwrap(); // <-- yield - + // request a large dataset let result = stream.write(get_dataset_request).await.unwrap(); // <-- yield - + // wait for the dataset let mut response = vec![]; stream.read(&mut response).await.unwrap(); // <-- yield // do some CPU-intensive analysis on the dataset let report = analyzer::analyze_data(response).unwrap(); - + // send the results back stream.write(report).await.unwrap(); // <-- yield }; @@ -189,16 +189,16 @@ can either perform CPU-intensive tasks or "blocking" tasks which is not supporte by the runtime. Now, armed with this knowledge you are already on a good way for understanding -Futures, but we're not gonna stop yet, there is lots of details to cover. +Futures, but we're not gonna stop yet, there is lots of details to cover. Take a break or a cup of coffe and get ready as we go for a deep dive in the next chapters. ## Bonus section If you find the concepts of concurrency and async programming confusing in -general, I know where you're coming from and I have written some resources to -try to give a high level overview that will make it easier to learn Rusts -`Futures` afterwards: +general, I know where you're coming from and I have written some resources to +try to give a high level overview that will make it easier to learn Rusts +`Future`s afterwards: * [Async Basics - The difference between concurrency and parallelism](https://cfsamson.github.io/book-exploring-async-basics/1_concurrent_vs_parallel.html) * [Async Basics - Async history](https://cfsamson.github.io/book-exploring-async-basics/2_async_history.html) @@ -206,7 +206,7 @@ try to give a high level overview that will make it easier to learn Rusts * [Async Basics - Epoll, Kqueue and IOCP](https://cfsamson.github.io/book-exploring-async-basics/6_epoll_kqueue_iocp.html) Learning these concepts by studying futures is making it much harder than -it needs to be, so go on and read these chapters if you feel a bit unsure. +it needs to be, so go on and read these chapters if you feel a bit unsure. I'll be right here when you're back. @@ -215,4 +215,4 @@ However, if you feel that you have the basics covered, then let's get moving! 
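
Returning to the CPU-intensive `analyzer::analyze_data` step in the pseudo-Rust async block earlier in this chapter: the sketch below shows one common way to keep such work from blocking the executor. It assumes the Tokio runtime, and `analyze_data` here is a made-up stand-in for that hypothetical analyzer call.

```rust
fn analyze_data(response: Vec<u8>) -> usize {
    // Stand-in for some expensive, CPU-bound analysis.
    response.len()
}

#[tokio::main]
async fn main() {
    let response = vec![0_u8; 1024];

    // `spawn_blocking` runs the closure on a thread pool reserved for
    // blocking/CPU-bound work, so the executor's worker threads stay free
    // to poll other tasks in the meantime.
    let report = tokio::task::spawn_blocking(move || analyze_data(response))
        .await
        .unwrap();

    println!("analyzed {} bytes", report);
}
```

Whether you reach for something like `spawn_blocking`, break the work up with extra yield points, or hand it to a dedicated thread pool is a per-runtime design choice; the point is only that long, uninterrupted CPU work inside a task starves every other task sharing that executor thread.
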
[async_std]: https://github.com/async-rs/async-std [tokio]: https://github.com/tokio-rs/tokio [compat_info]: https://rust-lang.github.io/futures-rs/blog/2019/04/18/compatibility-layer.html -[futures_rs]: https://github.com/rust-lang/futures-rs \ No newline at end of file +[futures_rs]: https://github.com/rust-lang/futures-rs diff --git a/src/2_waker_context.md b/src/2_waker_context.md index d063e17..61613c6 100644 --- a/src/2_waker_context.md +++ b/src/2_waker_context.md @@ -32,12 +32,12 @@ task-local storage and provide space for debugging hooks in later iterations. ## Understanding the `Waker` -One of the most confusing things we encounter when implementing our own `Futures` +One of the most confusing things we encounter when implementing our own `Future`s is how we implement a `Waker` . Creating a `Waker` involves creating a `vtable` which allows us to use dynamic dispatch to call methods on a _type erased_ trait object we construct our selves. ->If you want to know more about dynamic dispatch in Rust I can recommend an +>If you want to know more about dynamic dispatch in Rust I can recommend an article written by Adam Schwalm called [Exploring Dynamic Dispatch in Rust](https://alschwalm.com/blog/static/2017/03/07/exploring-dynamic-dispatch-in-rust/). Let's explain this a bit more in detail. @@ -46,7 +46,7 @@ Let's explain this a bit more in detail. To get a better understanding of how we implement the `Waker` in Rust, we need to take a step back and talk about some fundamentals. Let's start by taking a -look at the size of some different pointer types in Rust. +look at the size of some different pointer types in Rust. Run the following code _(You'll have to press "play" to see the output)_: @@ -177,4 +177,4 @@ use purely global functions and state, or any other way you wish. This leaves a lot of options on the table for runtime implementors. -[rfc2592]:https://github.com/rust-lang/rfcs/blob/master/text/2592-futures.md#waking-up \ No newline at end of file +[rfc2592]:https://github.com/rust-lang/rfcs/blob/master/text/2592-futures.md#waking-up diff --git a/src/3_generators_async_await.md b/src/3_generators_async_await.md index cb4f70e..7c2aea0 100644 --- a/src/3_generators_async_await.md +++ b/src/3_generators_async_await.md @@ -6,7 +6,7 @@ >- See first hand why we need `Pin` >- Understand what makes Rusts async model very memory efficient > ->The motivation for `Generators` can be found in [RFC#2033][rfc2033]. It's very +>The motivation for `Generator`s can be found in [RFC#2033][rfc2033]. It's very >well written and I can recommend reading through it (it talks as much about >async/await as it does about generators). @@ -38,7 +38,7 @@ coroutines which Rust uses today. ### Combinators -`Futures 0.1` used combinators. If you've worked with `Promises` in JavaScript, +`Futures 0.1` used combinators. If you've worked with `Promise`s in JavaScript, you already know combinators. In Rust they look like this: ```rust,noplaypen,ignore @@ -93,7 +93,7 @@ async fn myfn() { Async in Rust is implemented using Generators. So to understand how async really works we need to understand generators first. Generators in Rust are implemented -as state machines. +as state machines. The memory footprint of a chain of computations is defined by _the largest footprint that a single step requires_. @@ -206,7 +206,7 @@ impl Generator for GeneratorA { Now that you know that the `yield` keyword in reality rewrites your code to become a state machine, you'll also know the basics of how `await` works. 
It's very similar. -Now, there are some limitations in our naive state machine above. What happens when you have a +Now, there are some limitations in our naive state machine above. What happens when you have a `borrow` across a `yield` point? We could forbid that, but **one of the major design goals for the async/await syntax has been @@ -247,10 +247,10 @@ Now what does our rewritten state machine look like with this example? ```rust,compile_fail # enum GeneratorState { -# Yielded(Y), +# Yielded(Y), # Complete(R), # } -# +# # trait Generator { # type Yield; # type Return; @@ -314,8 +314,8 @@ As you'll notice, this compiles just fine! ```rust enum GeneratorState { - Yielded(Y), - Complete(R), + Yielded(Y), + Complete(R), } trait Generator { @@ -348,12 +348,12 @@ impl Generator for GeneratorA { let borrowed = &to_borrow; let res = borrowed.len(); *self = GeneratorA::Yield1 {to_borrow, borrowed: std::ptr::null()}; - + // NB! And we set the pointer to reference the to_borrow string here if let GeneratorA::Yield1 {to_borrow, borrowed} = self { *borrowed = to_borrow; } - + GeneratorState::Yielded(res) } @@ -401,16 +401,16 @@ pub fn main() { }; } # enum GeneratorState { -# Yielded(Y), -# Complete(R), +# Yielded(Y), +# Complete(R), # } -# +# # trait Generator { # type Yield; # type Return; # fn resume(&mut self) -> GeneratorState; # } -# +# # enum GeneratorA { # Enter, # Yield1 { @@ -419,7 +419,7 @@ pub fn main() { # }, # Exit, # } -# +# # impl GeneratorA { # fn start() -> Self { # GeneratorA::Enter @@ -435,15 +435,15 @@ pub fn main() { # let borrowed = &to_borrow; # let res = borrowed.len(); # *self = GeneratorA::Yield1 {to_borrow, borrowed: std::ptr::null()}; -# +# # // We set the self-reference here # if let GeneratorA::Yield1 {to_borrow, borrowed} = self { # *borrowed = to_borrow; # } -# +# # GeneratorState::Yielded(res) # } -# +# # GeneratorA::Yield1 {borrowed, ..} => { # let borrowed: &String = unsafe {&**borrowed}; # println!("{} world", borrowed); @@ -482,16 +482,16 @@ pub fn main() { }; } # enum GeneratorState { -# Yielded(Y), -# Complete(R), +# Yielded(Y), +# Complete(R), # } -# +# # trait Generator { # type Yield; # type Return; # fn resume(&mut self) -> GeneratorState; # } -# +# # enum GeneratorA { # Enter, # Yield1 { @@ -500,7 +500,7 @@ pub fn main() { # }, # Exit, # } -# +# # impl GeneratorA { # fn start() -> Self { # GeneratorA::Enter @@ -516,15 +516,15 @@ pub fn main() { # let borrowed = &to_borrow; # let res = borrowed.len(); # *self = GeneratorA::Yield1 {to_borrow, borrowed: std::ptr::null()}; -# +# # // We set the self-reference here # if let GeneratorA::Yield1 {to_borrow, borrowed} = self { # *borrowed = to_borrow; # } -# +# # GeneratorState::Yielded(res) # } -# +# # GeneratorA::Yield1 {borrowed, ..} => { # let borrowed: &String = unsafe {&**borrowed}; # println!("{} world", borrowed); @@ -625,7 +625,7 @@ pub fn main() { yield borrowed.len(); println!("{} world!", borrowed); }; - + let gen2 = static || { let to_borrow = String::from("Hello"); let borrowed = &to_borrow; @@ -639,7 +639,7 @@ pub fn main() { if let GeneratorState::Yielded(n) = pinned1.as_mut().resume(()) { println!("Gen1 got value {}", n); } - + if let GeneratorState::Yielded(n) = pinned2.as_mut().resume(()) { println!("Gen2 got value {}", n); }; @@ -655,4 +655,4 @@ pub fn main() { [rfc1832]: https://github.com/rust-lang/rfcs/pull/1832 [optimizing-await]: https://tmandry.gitlab.io/blog/posts/optimizing-await-1/ [pr45337]: https://github.com/rust-lang/rust/pull/45337/files -[issue43122]: 
https://github.com/rust-lang/rust/issues/43122 \ No newline at end of file +[issue43122]: https://github.com/rust-lang/rust/issues/43122 diff --git a/src/4_pin.md b/src/4_pin.md index 5fb3830..679e46e 100644 --- a/src/4_pin.md +++ b/src/4_pin.md @@ -30,7 +30,7 @@ Yep, you're right, that's double negation right there. `!Unpin` means On a more serious note, I feel obliged to mention that there are valid reasons for the names that were chosen. Naming is not easy, and I considered renaming -`Unpin` and `!Unpin` in this book to make them easier to reason about. +`Unpin` and `!Unpin` in this book to make them easier to reason about. However, an experienced member of the Rust community convinced me that that there is just too many nuances and edge-cases to consider which is easily overlooked when @@ -71,11 +71,11 @@ impl Test { let self_ref: *const String = &self.a; self.b = self_ref; } - + fn a(&self) -> &str { &self.a - } - + } + fn b(&self) -> &String { unsafe {&*(self.b)} } @@ -112,7 +112,7 @@ fn main() { # a: String, # b: *const String, # } -# +# # impl Test { # fn new(txt: &str) -> Self { # let a = String::from(txt); @@ -121,17 +121,17 @@ fn main() { # b: std::ptr::null(), # } # } -# +# # // We need an `init` method to actually set our self-reference # fn init(&mut self) { # let self_ref: *const String = &self.a; # self.b = self_ref; # } -# +# # fn a(&self) -> &str { # &self.a -# } -# +# } +# # fn b(&self) -> &String { # unsafe {&*(self.b)} # } @@ -168,7 +168,7 @@ fn main() { # a: String, # b: *const String, # } -# +# # impl Test { # fn new(txt: &str) -> Self { # let a = String::from(txt); @@ -177,16 +177,16 @@ fn main() { # b: std::ptr::null(), # } # } -# +# # fn init(&mut self) { # let self_ref: *const String = &self.a; # self.b = self_ref; # } -# +# # fn a(&self) -> &str { # &self.a -# } -# +# } +# # fn b(&self) -> &String { # unsafe {&*(self.b)} # } @@ -234,7 +234,7 @@ fn main() { # a: String, # b: *const String, # } -# +# # impl Test { # fn new(txt: &str) -> Self { # let a = String::from(txt); @@ -243,16 +243,16 @@ fn main() { # b: std::ptr::null(), # } # } -# +# # fn init(&mut self) { # let self_ref: *const String = &self.a; # self.b = self_ref; # } -# +# # fn a(&self) -> &str { # &self.a -# } -# +# } +# # fn b(&self) -> &String { # unsafe {&*(self.b)} # } @@ -267,7 +267,7 @@ I created a diagram to help visualize what's going on: **Fig 1: Before and after swap** ![swap_problem](./assets/swap_problem.jpg) -As you can see this results in unwanted behavior. It's easy to get this to +As you can see this results in unwanted behavior. It's easy to get this to segfault, show UB and fail in other spectacular ways as well. 
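
Before the stack-pinning examples below, here is a hedged sketch (not one of the book's listings) of where this is going: pinning the same kind of self-referential value to the heap with `Box::pin`, which the chapter summary also touches on. The `SelfRef` type and its methods are invented for illustration; marking it with `PhantomPinned` makes it `!Unpin`, so safe code can never move it, or `mem::swap` it, out from behind the pin again.

```rust
use std::marker::PhantomPinned;
use std::pin::Pin;

struct SelfRef {
    a: String,
    b: *const String,
    _marker: PhantomPinned, // makes `SelfRef` !Unpin
}

impl SelfRef {
    fn new(txt: &str) -> Pin<Box<Self>> {
        let mut boxed = Box::pin(SelfRef {
            a: String::from(txt),
            b: std::ptr::null(),
            _marker: PhantomPinned,
        });
        // Safety: we only read through this pointer while the value stays
        // pinned, and `Pin<Box<Self>>` guarantees the heap allocation never moves.
        let self_ptr: *const String = &boxed.a;
        unsafe { boxed.as_mut().get_unchecked_mut().b = self_ptr };
        boxed
    }

    fn b(self: Pin<&Self>) -> &String {
        unsafe { &*self.b }
    }
}

fn main() {
    let t = SelfRef::new("hello");
    println!("{}", t.as_ref().b()); // prints "hello"
}
```

Because a `Pin<Box<SelfRef>>` only hands out `&mut SelfRef` through `unsafe` escape hatches like `get_unchecked_mut`, the swap scenario above can't be reproduced in safe code once the value is pinned.
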
## Pinning to the stack @@ -329,7 +329,7 @@ pub fn main() { // Notice how we shadow `test1` to prevent it from beeing accessed again let mut test1 = unsafe { Pin::new_unchecked(&mut test1) }; Test::init(test1.as_mut()); - + let mut test2 = Test::new("test2"); let mut test2 = unsafe { Pin::new_unchecked(&mut test2) }; Test::init(test2.as_mut()); @@ -339,15 +339,15 @@ pub fn main() { } # use std::pin::Pin; # use std::marker::PhantomPinned; -# +# # #[derive(Debug)] # struct Test { # a: String, # b: *const String, # _marker: PhantomPinned, # } -# -# +# +# # impl Test { # fn new(txt: &str) -> Self { # let a = String::from(txt); @@ -363,11 +363,11 @@ pub fn main() { # let this = unsafe { self.get_unchecked_mut() }; # this.b = self_ptr; # } -# +# # fn a<'a>(self: Pin<&'a Self>) -> &'a str { # &self.get_ref().a # } -# +# # fn b<'a>(self: Pin<&'a Self>) -> &'a String { # unsafe { &*(self.b) } # } @@ -382,7 +382,7 @@ pub fn main() { let mut test1 = Test::new("test1"); let mut test1 = unsafe { Pin::new_unchecked(&mut test1) }; Test::init(test1.as_mut()); - + let mut test2 = Test::new("test2"); let mut test2 = unsafe { Pin::new_unchecked(&mut test2) }; Test::init(test2.as_mut()); @@ -393,15 +393,15 @@ pub fn main() { } # use std::pin::Pin; # use std::marker::PhantomPinned; -# +# # #[derive(Debug)] # struct Test { # a: String, # b: *const String, # _marker: PhantomPinned, # } -# -# +# +# # impl Test { # fn new(txt: &str) -> Self { # let a = String::from(txt); @@ -417,11 +417,11 @@ pub fn main() { # let this = unsafe { self.get_unchecked_mut() }; # this.b = self_ptr; # } -# +# # fn a<'a>(self: Pin<&'a Self>) -> &'a str { # &self.get_ref().a # } -# +# # fn b<'a>(self: Pin<&'a Self>) -> &'a String { # unsafe { &*(self.b) } # } @@ -434,9 +434,9 @@ us from swapping the pinned pointers. > It's important to note that stack pinning will always depend on the current > stack frame we're in, so we can't create a self referential object in one > stack frame and return it since any pointers we take to "self" is invalidated. -> +> > It also puts a lot of responsibility in your hands if you pin a value to the -> stack. A mistake that is easy to make is, forgetting to shadow the original variable +> stack. A mistake that is easy to make is, forgetting to shadow the original variable > since you could drop the pinned pointer and access the old value > after it's initialized like this: > @@ -446,7 +446,7 @@ us from swapping the pinned pointers. > let mut test1_pin = unsafe { Pin::new_unchecked(&mut test1) }; > Test::init(test1_pin.as_mut()); > drop(test1_pin); -> +> > let mut test2 = Test::new("test2"); > mem::swap(&mut test1, &mut test2); > println!("Not self referential anymore: {:?}", test1.b); @@ -454,15 +454,15 @@ us from swapping the pinned pointers. > # use std::pin::Pin; > # use std::marker::PhantomPinned; > # use std::mem; -> # +> # > # #[derive(Debug)] > # struct Test { > # a: String, > # b: *const String, > # _marker: PhantomPinned, > # } -> # -> # +> # +> # > # impl Test { > # fn new(txt: &str) -> Self { > # let a = String::from(txt); @@ -478,11 +478,11 @@ us from swapping the pinned pointers. > # let this = unsafe { self.get_unchecked_mut() }; > # this.b = self_ptr; > # } -> # +> # > # fn a<'a>(self: Pin<&'a Self>) -> &'a str { > # &self.get_ref().a > # } -> # +> # > # fn b<'a>(self: Pin<&'a Self>) -> &'a String { > # unsafe { &*(self.b) } > # } @@ -564,7 +564,7 @@ code. certain operations on this value. 1. Most standard library types implement `Unpin`. 
The same goes for most -"normal" types you encounter in Rust. `Futures` and `Generators` are two +"normal" types you encounter in Rust. `Future`s and `Generator`s are two exceptions. 5. The main use case for `Pin` is to allow self referential types, the whole @@ -585,8 +585,8 @@ by adding `std::marker::PhantomPinned` to your type on stable. 10. Pinning a `!Unpin` pointer to the heap does not require `unsafe`. There is a shortcut for doing this using `Box::pin`. -> Unsafe code does not mean it's literally "unsafe", it only relieves the -> guarantees you normally get from the compiler. An `unsafe` implementation can +> Unsafe code does not mean it's literally "unsafe", it only relieves the +> guarantees you normally get from the compiler. An `unsafe` implementation can > be perfectly safe to do, but you have no safety net. @@ -605,7 +605,7 @@ extra care must be taken when implementing `Drop` for pinned types. ## Putting it all together -This is exactly what we'll do when we implement our own `Futures` stay tuned, +This is exactly what we'll do when we implement our own `Future`s stay tuned, we're soon finished. [rfc2349]: https://github.com/rust-lang/rfcs/blob/master/text/2349-pin.md @@ -638,7 +638,7 @@ pub fn main() { let mut pinned1 = Box::pin(gen1); let mut pinned2 = Box::pin(gen2); - // Uncomment these if you think it's safe to pin the values to the stack instead + // Uncomment these if you think it's safe to pin the values to the stack instead // (it is in this case). Remember to comment out the two previous lines first. //let mut pinned1 = unsafe { Pin::new_unchecked(&mut gen1) }; //let mut pinned2 = unsafe { Pin::new_unchecked(&mut gen2) }; @@ -646,7 +646,7 @@ pub fn main() { if let GeneratorState::Yielded(n) = pinned1.as_mut().resume() { println!("Gen1 got value {}", n); } - + if let GeneratorState::Yielded(n) = pinned2.as_mut().resume() { println!("Gen2 got value {}", n); }; @@ -661,8 +661,8 @@ pub fn main() { } enum GeneratorState { - Yielded(Y), - Complete(R), + Yielded(Y), + Complete(R), } trait Generator { @@ -739,4 +739,4 @@ they did their unsafe implementation. Hopefully, after this you'll have an idea of what happens when you use the `yield` or `await` keywords inside an async function, and why we need `Pin` if -we want to be able to safely borrow across `yield/await` points. \ No newline at end of file +we want to be able to safely borrow across `yield/await` points. diff --git a/src/6_future_example.md b/src/6_future_example.md index d654e0d..2b5a7fd 100644 --- a/src/6_future_example.md +++ b/src/6_future_example.md @@ -1,6 +1,6 @@ # Futures in Rust -We'll create our own `Futures` together with a fake reactor and a simple +We'll create our own `Future`s together with a fake reactor and a simple executor which allows you to edit, run an play around with the code right here in your browser. @@ -51,8 +51,8 @@ a `Future` has resolved and should be polled again. fn block_on(mut future: F) -> F::Output { // the first thing we do is to construct a `Waker` which we'll pass on to - // the `reactor` so it can wake us up when an event is ready. - let mywaker = Arc::new(MyWaker{ thread: thread::current() }); + // the `reactor` so it can wake us up when an event is ready. + let mywaker = Arc::new(MyWaker{ thread: thread::current() }); let waker = waker_into_waker(Arc::into_raw(mywaker)); // The context struct is just a wrapper for a `Waker` object. 
Maybe in the @@ -70,7 +70,7 @@ fn block_on(mut future: F) -> F::Output { // an event occurs, or a thread has a "spurious wakeup" (an unexpected wakeup // that can happen for no good reason). let val = loop { - + match Future::poll(pinned, &mut cx) { // when the Future is ready we're finished @@ -88,26 +88,26 @@ In all the examples you'll see in this chapter I've chosen to comment the code extensively. I find it easier to follow along that way so I'll not repeat myself here and focus only on some important aspects that might need further explanation. -Now that you've read so much about `Generators` and `Pin` already this should +Now that you've read so much about `Generator`s and `Pin` already this should be rather easy to understand. `Future` is a state machine, every `await` point is a `yield` point. We could borrow data across `await` points and we meet the exact same challenges as we do when borrowing across `yield` points. > `Context` is just a wrapper around the `Waker`. At the time of writing this book it's nothing more. In the future it might be possible that the `Context` -object will do more than just wrapping a `Future` so having this extra +object will do more than just wrapping a `Future` so having this extra abstraction gives some flexibility. As explained in the [chapter about generators](./3_generators_pin.md), we use -`Pin` and the guarantees that give us to allow `Futures` to have self +`Pin` and the guarantees that give us to allow `Future`s to have self references. ## The `Future` implementation Futures has a well defined interface, which means they can be used across the -entire ecosystem. +entire ecosystem. -We can chain these `Futures` so that once a **leaf-future** is +We can chain these `Future`s so that once a **leaf-future** is ready we'll perform a set of operations until either the task is finished or we reach yet another **leaf-future** which we'll wait for and yield control to the scheduler. @@ -230,9 +230,9 @@ is just increasing the refcount in this case. Dropping a `Waker` is as easy as decreasing the refcount. Now, in special cases we could choose to not use an `Arc`. So this low-level method is there -to allow such cases. +to allow such cases. -Indeed, if we only used `Arc` there is no reason for us to go through all the +Indeed, if we only used `Arc` there is no reason for us to go through all the trouble of creating our own `vtable` and a `RawWaker`. We could just implement a normal trait. @@ -245,13 +245,13 @@ The reactor will often be a global resource which let's us register interests without passing around a reference. > ### Why using thread park/unpark is a bad idea for a library -> +> > It could deadlock easily since anyone could get a handle to the `executor thread` > and call park/unpark on it. -> +> > 1. A future could call `unpark` on the executor thread from a different thread > 2. Our `executor` thinks that data is ready and wakes up and polls the future -> 3. The future is not ready yet when polled, but at that exact same time the +> 3. The future is not ready yet when polled, but at that exact same time the > `Reactor` gets an event and calls `wake()` which also unparks our thread. > 4. This could happen before we go to sleep again since these processes > run in parallel. @@ -277,13 +277,13 @@ Since concurrency mostly makes sense when interacting with the outside world (or at least some peripheral), we need something to actually abstract over this interaction in an asynchronous way. -This is the `Reactors` job. 
Most often you'll see reactors in Rust use a library +This is the `Reactor`s job. Most often you'll see reactors in Rust use a library called [Mio][mio], which provides non blocking APIs and event notification for several platforms. The reactor will typically give you something like a `TcpStream` (or any other resource) which you'll use to create an I/O request. What you get in return is a -`Future`. +`Future`. >If our reactor did some real I/O work our `Task` in would instead be represent >a non-blocking `TcpStream` which registers interest with the global `Reactor`. @@ -434,7 +434,7 @@ which you can edit and change the way you like. # task::{Context, Poll, RawWaker, RawWakerVTable, Waker}, # thread::{self, JoinHandle}, time::{Duration, Instant} # }; -# +# fn main() { // This is just to make it easier for us to see when our Future was resolved let start = Instant::now(); @@ -442,10 +442,10 @@ fn main() { // Many runtimes create a glocal `reactor` we pass it as an argument let reactor = Reactor::new(); - // Since we'll share this between threads we wrap it in a + // Since we'll share this between threads we wrap it in a // atmically-refcounted- mutex. let reactor = Arc::new(Mutex::new(reactor)); - + // We create two tasks: // - first parameter is the `reactor` // - the second is a timeout in seconds @@ -485,7 +485,7 @@ fn main() { # // ============================= EXECUTOR ==================================== # fn block_on(mut future: F) -> F::Output { -# let mywaker = Arc::new(MyWaker{ thread: thread::current() }); +# let mywaker = Arc::new(MyWaker{ thread: thread::current() }); # let waker = waker_into_waker(Arc::into_raw(mywaker)); # let mut cx = Context::from_waker(&waker); # let val = loop { @@ -497,13 +497,13 @@ fn main() { # }; # val # } -# +# # // ====================== FUTURE IMPLEMENTATION ============================== # #[derive(Clone)] # struct MyWaker { # thread: thread::Thread, # } -# +# # #[derive(Clone)] # pub struct Task { # id: usize, @@ -511,19 +511,19 @@ fn main() { # data: u64, # is_registered: bool, # } -# +# # fn mywaker_wake(s: &MyWaker) { # let waker_ptr: *const MyWaker = s; # let waker_arc = unsafe {Arc::from_raw(waker_ptr)}; # waker_arc.thread.unpark(); # } -# +# # fn mywaker_clone(s: &MyWaker) -> RawWaker { # let arc = unsafe { Arc::from_raw(s).clone() }; # std::mem::forget(arc.clone()); // increase ref count # RawWaker::new(Arc::into_raw(arc) as *const (), &VTABLE) # } -# +# # const VTABLE: RawWakerVTable = unsafe { # RawWakerVTable::new( # |s| mywaker_clone(&*(s as *const MyWaker)), // clone @@ -532,12 +532,12 @@ fn main() { # |s| drop(Arc::from_raw(s as *const MyWaker)), // decrease refcount # ) # }; -# +# # fn waker_into_waker(s: *const MyWaker) -> Waker { # let raw_waker = RawWaker::new(s as *const (), &VTABLE); # unsafe { Waker::from_raw(raw_waker) } # } -# +# # impl Task { # fn new(reactor: Arc>, data: u64, id: usize) -> Self { # Task { @@ -548,7 +548,7 @@ fn main() { # } # } # } -# +# # impl Future for Task { # type Output = usize; # fn poll(mut self: Pin<&mut Self>, cx: &mut Context<'_>) -> Poll { @@ -565,7 +565,7 @@ fn main() { # } # } # } -# +# # // =============================== REACTOR =================================== # struct Reactor { # dispatcher: Sender, @@ -577,7 +577,7 @@ fn main() { # Close, # Timeout(Waker, u64, usize), # } -# +# # impl Reactor { # fn new() -> Self { # let (tx, rx) = channel::(); @@ -597,34 +597,34 @@ fn main() { # rl_clone.lock().map(|mut rl| rl.push(id)).unwrap(); # waker.wake(); # }); -# +# # 
handles.push(event_handle); # } # } # } -# +# # for handle in handles { # handle.join().unwrap(); # } # }); -# +# # Reactor { # readylist, # dispatcher: tx, # handle: Some(handle), # } # } -# +# # fn register(&mut self, duration: u64, waker: Waker, data: usize) { # self.dispatcher # .send(Event::Timeout(waker, duration, data)) # .unwrap(); # } -# +# # fn close(&mut self) { # self.dispatcher.send(Event::Close).unwrap(); # } -# +# # fn is_ready(&self, id_to_check: usize) -> bool { # self.readylist # .lock() @@ -632,7 +632,7 @@ fn main() { # .unwrap() # } # } -# +# # impl Drop for Reactor { # fn drop(&mut self) { # self.handle.take().map(|h| h.join().unwrap()).unwrap(); @@ -654,7 +654,7 @@ The `async` keyword can be used on functions as in `async fn(...)` or on a block as in `async { ... }`. Both will turn your function, or block, into a `Future`. -These `Futures` are rather simple. Imagine our generator from a few chapters +These `Future`s are rather simple. Imagine our generator from a few chapters back. Every `await` point is like a `yield` point. Instead of `yielding` a value we pass in, we yield the result of calling `poll` on @@ -675,7 +675,7 @@ Future got 1 at time: 1.00. Future got 2 at time: 3.00. ``` -If these `Futures` were executed asynchronously we would expect to see: +If these `Future`s were executed asynchronously we would expect to see: ```ignore Future got 1 at time: 1.00. @@ -697,7 +697,7 @@ how they implement different ways of running Futures to completion. [If I were you I would read this next, and try to implement it for our example.](./conclusion.md#building-a-better-exectuor). That's actually it for now. There as probably much more to learn, this is enough -for today. +for today. I hope exploring Futures and async in general gets easier after this read and I do really hope that you do continue to explore further. @@ -710,4 +710,4 @@ Don't forget the exercises in the last chapter 😊. [playground_example]:https://play.rust-lang.org/?version=stable&mode=debug&edition=2018&gist=ca43dba55c6e3838c5494de45875677f [spurious_wakeup]: https://cfsamson.github.io/book-exploring-async-basics/9_3_http_module.html#bonus-section [condvar]: https://doc.rust-lang.org/stable/std/sync/struct.Condvar.html -[crossbeam_parker]: https://docs.rs/crossbeam/0.7.3/crossbeam/sync/struct.Parker.html \ No newline at end of file +[crossbeam_parker]: https://docs.rs/crossbeam/0.7.3/crossbeam/sync/struct.Parker.html diff --git a/src/introduction.md b/src/introduction.md index 11e3bf3..209ea77 100644 --- a/src/introduction.md +++ b/src/introduction.md @@ -1,6 +1,6 @@ # Futures Explained in 200 Lines of Rust -This book aims to explain `Futures` in Rust using an example driven approach, +This book aims to explain `Future`s in Rust using an example driven approach, exploring why they're designed the way they are, and how they work. We'll also take a look at some of the alternatives we have when dealing with concurrency in programming. @@ -31,7 +31,7 @@ I've limited myself to a 200 line main example (hence the title) to limit the scope and introduce an example that can easily be explored further. However, there is a lot to digest and it's not what I would call easy, but we'll -take everything step by step so get a cup of tea and relax. +take everything step by step so get a cup of tea and relax. I hope you enjoy the ride. @@ -57,7 +57,7 @@ in Rust. 
If you like it, you might want to check out the others as well: ## Credits and thanks I'd like to take this chance to thank the people behind `mio`, `tokio`, -`async_std`, `Futures`, `libc`, `crossbeam` which underpins so much of the +`async_std`, `Future`s, `libc`, `crossbeam` which underpins so much of the async ecosystem and and rarely gets enough praise in my eyes. A special thanks to [jonhoo](https://twitter.com/jonhoo) who was kind enough to @@ -66,4 +66,4 @@ read the finished product, but a big thanks is definitely due. [mdbook]: https://github.com/rust-lang/mdBook [book_repo]: https://github.com/cfsamson/books-futures-explained -[example_repo]: https://github.com/cfsamson/examples-futures \ No newline at end of file +[example_repo]: https://github.com/cfsamson/examples-futures