
Commit c7af9e2

Auto merge of #238 - epilys:master, r=mbrubeck
Return allocation error in deserialize instead of panicking

There's no way to catch allocation errors, since out-of-memory errors cause an abort. Fail gracefully by returning the error instead of panicking.

I happened upon this error when deserializing untrusted data with bincode. Bincode provides a byte-limit bound, but for sequences it's not possible to enforce this through serde, since collection types like smallvec handle their own allocation.
2 parents: 28fb0f4 + d1394a0
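The core of the change is swapping the infallible `with_capacity` call for `try_reserve`, which reports failure through a `Result` instead of aborting the process. Below is a minimal sketch of that pattern outside the deserializer, assuming smallvec's public `try_reserve` method and `CollectionAllocErr` type; the `collect_with_hint` helper is purely illustrative and not part of this change.

```rust
use smallvec::{CollectionAllocErr, SmallVec};

// Illustrative helper: pre-allocate from an untrusted length hint without
// panicking or aborting when the allocation cannot be satisfied.
fn collect_with_hint(hint: usize) -> Result<SmallVec<[u8; 8]>, CollectionAllocErr> {
    let mut values: SmallVec<[u8; 8]> = SmallVec::new();
    // try_reserve returns Err(CollectionAllocErr) on capacity overflow or
    // allocation failure, so the caller decides how to recover.
    values.try_reserve(hint)?;
    Ok(values)
}
```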

File tree

1 file changed: +9 -1 lines changed

src/lib.rs

Lines changed: 9 additions & 1 deletion
@@ -233,6 +233,12 @@ pub enum CollectionAllocErr {
     },
 }
 
+impl fmt::Display for CollectionAllocErr {
+    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
+        write!(f, "Allocation error: {:?}", self)
+    }
+}
+
 impl From<LayoutErr> for CollectionAllocErr {
     fn from(_: LayoutErr) -> Self {
         CollectionAllocErr::CapacityOverflow

@@ -1543,8 +1549,10 @@ where
     where
         B: SeqAccess<'de>,
     {
+        use serde::de::Error;
         let len = seq.size_hint().unwrap_or(0);
-        let mut values = SmallVec::with_capacity(len);
+        let mut values = SmallVec::new();
+        values.try_reserve(len).map_err(B::Error::custom)?;
 
         while let Some(value) = seq.next_element()? {
             values.push(value);
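The new `Display` impl lets the error be formatted directly when it is surfaced to callers (for example via `B::Error::custom` above). A small sketch, assuming the public `CollectionAllocErr::CapacityOverflow` variant:

```rust
use smallvec::CollectionAllocErr;

fn main() {
    let err = CollectionAllocErr::CapacityOverflow;
    // Display delegates to Debug, so this prints:
    // "Allocation error: CapacityOverflow"
    println!("{}", err);
}
```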
