nodeinfo: Phase out cpu_set_t usage
Swap out all instances of cpu_set_t and replace them with virBitmap, which part of the code was already using anyway.

The changes are mostly mechanical, with one notable exception: an upper bound is now assumed on the values we can encounter while reading either socket_id or core_id. While this specific assumption was not in place before, cpu_set_t was already being used improperly: nothing ensured that no bit past CPU_SETSIZE was ever set, and no bigger set was ever allocated explicitly. In fact, the default size of a cpu_set_t, 1024 bits, is way too small to run our test suite, which includes core_id values in the 2000s.
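As a rough illustration of the pattern (not code taken from the patch itself), the replacement looks something like the sketch below. It assumes libvirt's existing virBitmap helpers virBitmapNew(), virBitmapSetBit() and virBitmapFree(); ID_MAX and recordCore() are made-up names standing in for whatever cap and call site the real code uses:

    #include "virbitmap.h"

    /* ID_MAX is an illustrative cap, not the value chosen by the patch. */
    #define ID_MAX 4095

    static int
    recordCore(virBitmap *cores, unsigned int core_id)
    {
        /* Unlike CPU_SET() on a cpu_set_t, virBitmapSetBit() fails
         * cleanly when the bit is out of range instead of writing
         * past the end of the allocation. */
        if (core_id > ID_MAX)
            return -1;
        return virBitmapSetBit(cores, core_id);
    }

The bitmap is sized up front with virBitmapNew(ID_MAX + 1) and released with virBitmapFree(), so the capacity is explicit rather than hard-wired to CPU_SETSIZE.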