In my project, I have a lot of situations like this:
constexpr size_t element_count = 42;
std::array<bool, element_count> elements{}; // value-initialize so the reads below are defined
for(size_t i = 0; i < element_count; ++i){
    if(i > 0 && elements[i - 1]){/*do something*/}
    else{/*do something else*/}
    if(i < element_count - 1 && elements[i + 1]){/*do something*/}
    else{/*do something else*/}
}
Without the checks i > 0 and i < element_count - 1, I'd get undefined behavior. If I use std::array::at instead of operator[], I get std::out_of_range exceptions instead. I was wondering whether there are any problems with just relying on the exception, like this:
for(size_t i = 0; i < element_count; ++i){
    try{
        if(elements.at(i - 1)){/*do something*/}
    }
    catch(const std::out_of_range& e){/*do something else*/}
    try{
        if(elements.at(i + 1)){/*do something*/}
    }
    catch(const std::out_of_range& e){/*do something else*/}
}
In this example it's more code, but in my real project it would reduce the amount of code because I'm using lots of multidimensional arrays and doing bounds checking for multiple dimensions.