Resource allocation (computing)
Resource allocation is the process by which a computing system assigns its available hardware to the applications running on it.[1] Computing, networking and energy resources must be allocated subject to hardware, performance and environmental constraints.[2] The process may be carried out by the hardware itself,[3] by an operating system, by a distributed computing system,[4] or as part of data center management.
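As a minimal illustration of the idea (not a mechanism described in this article), an allocator can be sketched as a pool that hands out a fixed set of resource identifiers and blocks requesters when none are free, much as an operating system mediates contention for finite hardware. The class and resource names below are hypothetical:

```python
import threading

class ResourcePool:
    """Illustrative sketch: allocate units from a fixed pool,
    blocking when every unit is already in use."""

    def __init__(self, resources):
        self._free = list(resources)                      # units not currently allocated
        self._lock = threading.Lock()                     # guards the free list
        self._available = threading.Semaphore(len(resources))  # counts free units

    def acquire(self):
        self._available.acquire()   # block until at least one unit is free
        with self._lock:
            return self._free.pop()

    def release(self, resource):
        with self._lock:
            self._free.append(resource)
        self._available.release()   # wake one blocked requester, if any

# Example: two hypothetical CPU slots shared between tasks.
pool = ResourcePool(["cpu0", "cpu1"])
a = pool.acquire()
b = pool.acquire()
pool.release(a)      # freeing a unit lets the next request succeed
c = pool.acquire()
```

Real allocators add policies on top of this skeleton, such as priorities, fairness, or energy-aware placement.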
References