Commit 99d4886
Auto merge of #49669 - SimonSapin:global-alloc, r=alexcrichton
Add GlobalAlloc trait + tweaks for initial stabilization

This is the outcome of discussion at the Rust All Hands in Berlin. The high-level goal is stabilizing sooner rather than later the ability to [change the global allocator](#27389), as well as allocating memory without abusing `Vec::with_capacity` + `mem::forget`.

Since we’re not ready to settle every detail of the `Alloc` trait for the purpose of collections that are generic over the allocator type (for example the possibility of a separate trait for deallocation only, and what that would look like exactly), we propose introducing separately **a new `GlobalAlloc` trait**, for use with the `#[global_allocator]` attribute. We also propose a number of changes to existing APIs. They are batched in this one PR in order to minimize disruption to Nightly users. The plan for initial stabilization is detailed in the tracking issue #49668.

CC @rust-lang/libs, @glandium

## Immediate breaking changes to unstable features

* For pointers to allocated memory, change the pointed type from `u8` to `Opaque`, a new public [extern type](#43467). Since extern types are not `Sized`, `<*mut _>::offset` cannot be used without first casting to another pointer type. (We hope that extern types can also be stabilized soon.)
* In the `Alloc` trait, change these pointers to `ptr::NonNull` and change the `AllocErr` type to a zero-size struct. This makes the return type `Result<ptr::NonNull<Opaque>, AllocErr>` pointer-sized.
* Instead of a new `Layout`, `realloc` takes only a new size (in addition to the pointer and old `Layout`). Changing the alignment is not supported with `realloc`.
* Change the return type of `Layout::from_size_align` from `Option<Self>` to `Result<Self, LayoutErr>`, with `LayoutErr` a new opaque struct.
* A `static` item registered as the global allocator with the `#[global_allocator]` attribute **must now implement the new `GlobalAlloc` trait** instead of `Alloc`.
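The `Layout::from_size_align` change can be sketched with the signature as it was eventually stabilized (using `is_ok()`/`is_err()` avoids naming the error type, which was later renamed on the path to stable):

```rust
use std::alloc::Layout;

fn main() {
    // A valid layout: power-of-two alignment, size does not overflow.
    assert!(Layout::from_size_align(1024, 8).is_ok());
    // A non-power-of-two alignment is now reported as an Err(..)
    // value rather than None, so failures carry a dedicated error type.
    assert!(Layout::from_size_align(1024, 3).is_err());
}
```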
## Eventually-breaking changes to unstable features, with a deprecation period

* Rename the respective `heap` modules to `alloc` in the `core`, `alloc`, and `std` crates. (Yes, this does mean that `::alloc::alloc::Alloc::alloc` is a valid path to a trait method if you have `extern crate alloc;`.)
* Rename the `Heap` type to `Global`, since it is the entry point for what’s registered with `#[global_allocator]`.

Old names remain available for now, as deprecated `pub use` reexports.

## Backward-compatible changes

* Add a new [extern type](#43467) `Opaque`, for use in pointers to allocated memory.
* Add a new `GlobalAlloc` trait, shown below. Unlike `Alloc`, it uses bare `*mut Opaque` without `NonNull` or `Result`. NULL in return values indicates an error (of unspecified nature). This is easier to implement on top of `malloc`-like APIs.
* Add impls of `GlobalAlloc` for both the `Global` and `System` types, in addition to the existing impls of `Alloc`. This enables calling `GlobalAlloc` methods on the stable channel before `Alloc` is stable. Implementing two traits with identical method names can make some calls ambiguous, but most code is expected to have no more than one of the two traits in scope.

Erroneous code like `use std::alloc::Global; #[global_allocator] static A: Global = Global;` (where `Global` is defined to call itself, causing infinite recursion) is not statically prevented by the type system, but we count on it being hard enough to do accidentally and easy enough to diagnose.
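The `GlobalAlloc` impl on `System` can be exercised directly, which is what makes the methods callable before `Alloc` is stable. A minimal sketch against the API as it ended up on stable (where the pointer type became `*mut u8` rather than `*mut Opaque`):

```rust
use std::alloc::{GlobalAlloc, Layout, System};

fn main() {
    let layout = Layout::from_size_align(64, 8).unwrap();
    unsafe {
        // GlobalAlloc returns a bare pointer; NULL signals failure,
        // with no Result or NonNull wrapping.
        let ptr = System.alloc_zeroed(layout);
        assert!(!ptr.is_null());
        assert_eq!(*ptr, 0); // alloc_zeroed gives zero-initialized memory
        System.dealloc(ptr, layout);
    }
}
```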
```rust
extern {
    pub type Opaque;
}

pub unsafe trait GlobalAlloc {
    unsafe fn alloc(&self, layout: Layout) -> *mut Opaque;
    unsafe fn dealloc(&self, ptr: *mut Opaque, layout: Layout);
    unsafe fn alloc_zeroed(&self, layout: Layout) -> *mut Opaque {
        // Default impl: self.alloc() and ptr::write_bytes()
    }
    unsafe fn realloc(&self, ptr: *mut Opaque, old_layout: Layout,
                      new_size: usize) -> *mut Opaque {
        // Default impl: self.alloc() and ptr::copy_nonoverlapping() and self.dealloc()
    }
    fn oom(&self) -> ! {
        // intrinsics::abort
    }
    // More methods with default impls may be added in the future
}
```

## Bikeshed

The tracking issue #49668 lists some open questions. If consensus is reached before this PR is merged, changes can be integrated.
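The intended usage pattern for the trait above is a forwarding allocator registered via `#[global_allocator]`. The sketch below targets the API as stabilized (pointers are `*mut u8` rather than `*mut Opaque`; the shape of the trait is otherwise the same). The `Forwarding` type is illustrative, not part of the PR:

```rust
use std::alloc::{GlobalAlloc, Layout, System};

// A trivial allocator that forwards everything to the system allocator.
struct Forwarding;

unsafe impl GlobalAlloc for Forwarding {
    unsafe fn alloc(&self, layout: Layout) -> *mut u8 {
        System.alloc(layout)
    }
    unsafe fn dealloc(&self, ptr: *mut u8, layout: Layout) {
        System.dealloc(ptr, layout)
    }
}

// Registration: a `static` whose type implements `GlobalAlloc`.
#[global_allocator]
static GLOBAL: Forwarding = Forwarding;

fn main() {
    // Every heap allocation in the program now goes through `Forwarding`.
    let v: Vec<u32> = (0..4).collect();
    assert_eq!(v, [0, 1, 2, 3]);
}
```

Note that only `alloc` and `dealloc` are required; `alloc_zeroed` and `realloc` have default implementations in terms of them.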
2 parents f9f9050 + c5ffdd7 commit 99d4886

File tree

56 files changed: +1038 −1496 lines


src/Cargo.lock

-3

src/doc/nomicon

src/doc/unstable-book/src/language-features/global-allocator.md

+5-4
````diff
@@ -29,16 +29,17 @@ looks like:
 ```rust
 #![feature(global_allocator, allocator_api, heap_api)]
 
-use std::heap::{Alloc, System, Layout, AllocErr};
+use std::alloc::{GlobalAlloc, System, Layout, Opaque};
+use std::ptr::NonNull;
 
 struct MyAllocator;
 
-unsafe impl<'a> Alloc for &'a MyAllocator {
-    unsafe fn alloc(&mut self, layout: Layout) -> Result<*mut u8, AllocErr> {
+unsafe impl GlobalAlloc for MyAllocator {
+    unsafe fn alloc(&self, layout: Layout) -> *mut Opaque {
         System.alloc(layout)
     }
 
-    unsafe fn dealloc(&mut self, ptr: *mut u8, layout: Layout) {
+    unsafe fn dealloc(&self, ptr: *mut Opaque, layout: Layout) {
         System.dealloc(ptr, layout)
     }
 }
````

src/liballoc/alloc.rs

+215
New file:

```rust
// Copyright 2014-2015 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.

#![unstable(feature = "allocator_api",
            reason = "the precise API and guarantees it provides may be tweaked \
                      slightly, especially to possibly take into account the \
                      types being stored to make room for a future \
                      tracing garbage collector",
            issue = "32838")]

use core::intrinsics::{min_align_of_val, size_of_val};
use core::ptr::NonNull;
use core::usize;

#[doc(inline)]
pub use core::alloc::*;

#[cfg(stage0)]
extern "Rust" {
    #[allocator]
    #[rustc_allocator_nounwind]
    fn __rust_alloc(size: usize, align: usize, err: *mut u8) -> *mut u8;
    #[cold]
    #[rustc_allocator_nounwind]
    fn __rust_oom(err: *const u8) -> !;
    #[rustc_allocator_nounwind]
    fn __rust_dealloc(ptr: *mut u8, size: usize, align: usize);
    #[rustc_allocator_nounwind]
    fn __rust_realloc(ptr: *mut u8,
                      old_size: usize,
                      old_align: usize,
                      new_size: usize,
                      new_align: usize,
                      err: *mut u8) -> *mut u8;
    #[rustc_allocator_nounwind]
    fn __rust_alloc_zeroed(size: usize, align: usize, err: *mut u8) -> *mut u8;
}

#[cfg(not(stage0))]
extern "Rust" {
    #[allocator]
    #[rustc_allocator_nounwind]
    fn __rust_alloc(size: usize, align: usize) -> *mut u8;
    #[cold]
    #[rustc_allocator_nounwind]
    fn __rust_oom() -> !;
    #[rustc_allocator_nounwind]
    fn __rust_dealloc(ptr: *mut u8, size: usize, align: usize);
    #[rustc_allocator_nounwind]
    fn __rust_realloc(ptr: *mut u8,
                      old_size: usize,
                      align: usize,
                      new_size: usize) -> *mut u8;
    #[rustc_allocator_nounwind]
    fn __rust_alloc_zeroed(size: usize, align: usize) -> *mut u8;
}

#[derive(Copy, Clone, Default, Debug)]
pub struct Global;

#[unstable(feature = "allocator_api", issue = "32838")]
#[rustc_deprecated(since = "1.27.0", reason = "type renamed to `Global`")]
pub type Heap = Global;

#[unstable(feature = "allocator_api", issue = "32838")]
#[rustc_deprecated(since = "1.27.0", reason = "type renamed to `Global`")]
#[allow(non_upper_case_globals)]
pub const Heap: Global = Global;

unsafe impl GlobalAlloc for Global {
    #[inline]
    unsafe fn alloc(&self, layout: Layout) -> *mut Opaque {
        #[cfg(not(stage0))]
        let ptr = __rust_alloc(layout.size(), layout.align());
        #[cfg(stage0)]
        let ptr = __rust_alloc(layout.size(), layout.align(), &mut 0);
        ptr as *mut Opaque
    }

    #[inline]
    unsafe fn dealloc(&self, ptr: *mut Opaque, layout: Layout) {
        __rust_dealloc(ptr as *mut u8, layout.size(), layout.align())
    }

    #[inline]
    unsafe fn realloc(&self, ptr: *mut Opaque, layout: Layout, new_size: usize) -> *mut Opaque {
        #[cfg(not(stage0))]
        let ptr = __rust_realloc(ptr as *mut u8, layout.size(), layout.align(), new_size);
        #[cfg(stage0)]
        let ptr = __rust_realloc(ptr as *mut u8, layout.size(), layout.align(),
                                 new_size, layout.align(), &mut 0);
        ptr as *mut Opaque
    }

    #[inline]
    unsafe fn alloc_zeroed(&self, layout: Layout) -> *mut Opaque {
        #[cfg(not(stage0))]
        let ptr = __rust_alloc_zeroed(layout.size(), layout.align());
        #[cfg(stage0)]
        let ptr = __rust_alloc_zeroed(layout.size(), layout.align(), &mut 0);
        ptr as *mut Opaque
    }

    #[inline]
    fn oom(&self) -> ! {
        unsafe {
            #[cfg(not(stage0))]
            __rust_oom();
            #[cfg(stage0)]
            __rust_oom(&mut 0);
        }
    }
}

unsafe impl Alloc for Global {
    #[inline]
    unsafe fn alloc(&mut self, layout: Layout) -> Result<NonNull<Opaque>, AllocErr> {
        NonNull::new(GlobalAlloc::alloc(self, layout)).ok_or(AllocErr)
    }

    #[inline]
    unsafe fn dealloc(&mut self, ptr: NonNull<Opaque>, layout: Layout) {
        GlobalAlloc::dealloc(self, ptr.as_ptr(), layout)
    }

    #[inline]
    unsafe fn realloc(&mut self,
                      ptr: NonNull<Opaque>,
                      layout: Layout,
                      new_size: usize)
                      -> Result<NonNull<Opaque>, AllocErr>
    {
        NonNull::new(GlobalAlloc::realloc(self, ptr.as_ptr(), layout, new_size)).ok_or(AllocErr)
    }

    #[inline]
    unsafe fn alloc_zeroed(&mut self, layout: Layout) -> Result<NonNull<Opaque>, AllocErr> {
        NonNull::new(GlobalAlloc::alloc_zeroed(self, layout)).ok_or(AllocErr)
    }

    #[inline]
    fn oom(&mut self) -> ! {
        GlobalAlloc::oom(self)
    }
}

/// The allocator for unique pointers.
// This function must not unwind. If it does, MIR trans will fail.
#[cfg(not(test))]
#[lang = "exchange_malloc"]
#[inline]
unsafe fn exchange_malloc(size: usize, align: usize) -> *mut u8 {
    if size == 0 {
        align as *mut u8
    } else {
        let layout = Layout::from_size_align_unchecked(size, align);
        let ptr = Global.alloc(layout);
        if !ptr.is_null() {
            ptr as *mut u8
        } else {
            Global.oom()
        }
    }
}

#[cfg_attr(not(test), lang = "box_free")]
#[inline]
pub(crate) unsafe fn box_free<T: ?Sized>(ptr: *mut T) {
    let size = size_of_val(&*ptr);
    let align = min_align_of_val(&*ptr);
    // We do not allocate for Box<T> when T is ZST, so deallocation is also not necessary.
    if size != 0 {
        let layout = Layout::from_size_align_unchecked(size, align);
        Global.dealloc(ptr as *mut Opaque, layout);
    }
}

#[cfg(test)]
mod tests {
    extern crate test;
    use self::test::Bencher;
    use boxed::Box;
    use alloc::{Global, Alloc, Layout};

    #[test]
    fn allocate_zeroed() {
        unsafe {
            let layout = Layout::from_size_align(1024, 1).unwrap();
            let ptr = Global.alloc_zeroed(layout.clone())
                .unwrap_or_else(|_| Global.oom());

            let mut i = ptr.cast::<u8>().as_ptr();
            let end = i.offset(layout.size() as isize);
            while i < end {
                assert_eq!(*i, 0);
                i = i.offset(1);
            }
            Global.dealloc(ptr, layout);
        }
    }

    #[bench]
    fn alloc_owned_small(b: &mut Bencher) {
        b.iter(|| {
            let _: Box<_> = box 10;
        })
    }
}
```
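The `Alloc`-over-`GlobalAlloc` adapter in this file hinges on one pattern: `NonNull::new` turns a possibly-null raw pointer into an `Option`, and `.ok_or(AllocErr)` then maps null to the zero-size error struct. The same pattern in isolation (here `ZeroSizeErr` and `into_result` are illustrative stand-ins for `AllocErr` and the trait methods, which need unstable features to use directly):

```rust
use std::ptr::NonNull;

// Stand-in for the zero-size `AllocErr` struct.
#[derive(Debug, PartialEq)]
struct ZeroSizeErr;

// Convert a malloc-style "NULL means failure" pointer into a Result,
// exactly as the `Alloc for Global` impl does with `.ok_or(AllocErr)`.
fn into_result(ptr: *mut u8) -> Result<NonNull<u8>, ZeroSizeErr> {
    NonNull::new(ptr).ok_or(ZeroSizeErr)
}

fn main() {
    let mut x = 0u8;
    // A valid pointer becomes Ok(NonNull<u8>).
    assert!(into_result(&mut x as *mut u8).is_ok());
    // A null pointer becomes the zero-size error value.
    assert_eq!(into_result(std::ptr::null_mut()), Err(ZeroSizeErr));
}
```

Because `AllocErr` is zero-sized and `NonNull` has a niche, the whole `Result` is pointer-sized, which is the point of the second breaking change listed in the commit message.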

src/liballoc/arc.rs

+9-14
```diff
@@ -21,7 +21,6 @@ use core::sync::atomic::Ordering::{Acquire, Relaxed, Release, SeqCst};
 use core::borrow;
 use core::fmt;
 use core::cmp::Ordering;
-use core::heap::{Alloc, Layout};
 use core::intrinsics::abort;
 use core::mem::{self, align_of_val, size_of_val, uninitialized};
 use core::ops::Deref;
@@ -32,7 +31,7 @@ use core::hash::{Hash, Hasher};
 use core::{isize, usize};
 use core::convert::From;
 
-use heap::{Heap, box_free};
+use alloc::{Global, Alloc, Layout, box_free};
 use boxed::Box;
 use string::String;
 use vec::Vec;
@@ -513,15 +512,13 @@ impl<T: ?Sized> Arc<T> {
     // Non-inlined part of `drop`.
     #[inline(never)]
     unsafe fn drop_slow(&mut self) {
-        let ptr = self.ptr.as_ptr();
-
         // Destroy the data at this time, even though we may not free the box
         // allocation itself (there may still be weak pointers lying around).
         ptr::drop_in_place(&mut self.ptr.as_mut().data);
 
         if self.inner().weak.fetch_sub(1, Release) == 1 {
             atomic::fence(Acquire);
-            Heap.dealloc(ptr as *mut u8, Layout::for_value(&*ptr))
+            Global.dealloc(self.ptr.as_opaque(), Layout::for_value(self.ptr.as_ref()))
         }
     }
 
@@ -555,11 +552,11 @@ impl<T: ?Sized> Arc<T> {
 
         let layout = Layout::for_value(&*fake_ptr);
 
-        let mem = Heap.alloc(layout)
-            .unwrap_or_else(|e| Heap.oom(e));
+        let mem = Global.alloc(layout)
+            .unwrap_or_else(|_| Global.oom());
 
         // Initialize the real ArcInner
-        let inner = set_data_ptr(ptr as *mut T, mem) as *mut ArcInner<T>;
+        let inner = set_data_ptr(ptr as *mut T, mem.as_ptr() as *mut u8) as *mut ArcInner<T>;
 
         ptr::write(&mut (*inner).strong, atomic::AtomicUsize::new(1));
         ptr::write(&mut (*inner).weak, atomic::AtomicUsize::new(1));
@@ -626,7 +623,7 @@ impl<T: Clone> ArcFromSlice<T> for Arc<[T]> {
         // In the event of a panic, elements that have been written
         // into the new ArcInner will be dropped, then the memory freed.
         struct Guard<T> {
-            mem: *mut u8,
+            mem: NonNull<u8>,
             elems: *mut T,
             layout: Layout,
             n_elems: usize,
@@ -640,7 +637,7 @@ impl<T: Clone> ArcFromSlice<T> for Arc<[T]> {
                 let slice = from_raw_parts_mut(self.elems, self.n_elems);
                 ptr::drop_in_place(slice);
 
-                Heap.dealloc(self.mem, self.layout.clone());
+                Global.dealloc(self.mem.as_opaque(), self.layout.clone());
             }
         }
@@ -656,7 +653,7 @@ impl<T: Clone> ArcFromSlice<T> for Arc<[T]> {
         let elems = &mut (*ptr).data as *mut [T] as *mut T;
 
         let mut guard = Guard{
-            mem: mem,
+            mem: NonNull::new_unchecked(mem),
             elems: elems,
             layout: layout,
             n_elems: 0,
@@ -1148,8 +1145,6 @@ impl<T: ?Sized> Drop for Weak<T> {
     /// assert!(other_weak_foo.upgrade().is_none());
     /// ```
     fn drop(&mut self) {
-        let ptr = self.ptr.as_ptr();
-
         // If we find out that we were the last weak pointer, then its time to
         // deallocate the data entirely. See the discussion in Arc::drop() about
         // the memory orderings
@@ -1161,7 +1156,7 @@ impl<T: ?Sized> Drop for Weak<T> {
         if self.inner().weak.fetch_sub(1, Release) == 1 {
             atomic::fence(Acquire);
             unsafe {
-                Heap.dealloc(ptr as *mut u8, Layout::for_value(&*ptr))
+                Global.dealloc(self.ptr.as_opaque(), Layout::for_value(self.ptr.as_ref()))
             }
         }
     }
```
