Abstract
The process of adding new blocks to a blockchain, known as mining, is an energy-intensive task. It is especially challenging on low-power Internet-of-Things (IoT) nodes with limited resources, so any blockchain protocol over IoT must account for the effect of mining on the nodes' actual power expenditure and performance. In this work, we design a thorough experiment to obtain an empirical understanding of the energy consumption of block mining over IoT. Specifically, considering Proof-of-Work (PoW) based consensus and a set of IoT nodes, we execute the most computationally intensive step of mining, viz. hashing, on those nodes at different block difficulty levels and record the actual energy consumed in each case. We plot the data against difficulty and, for each observation, fit a best-fit curve that captures the general trend. We infer empirically that, for any generic IoT node and any hash algorithm, the energy consumed grows parabolically with increasing difficulty. We also find that beyond a certain upper difficulty limit, block mining on IoT nodes becomes infeasible due to indefinite computation time. Our study suggests that using IoT devices for blockchain mining is achievable but requires careful selection and customization of the hashing algorithm and hardware for the specific IoT scenario. Toward an efficient blockchain architecture over IoT, our research provides an empirical base for formulating effective mining strategies.
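To make the benchmarked operation concrete, the following is a minimal sketch of the PoW hashing loop whose repeated execution dominates mining cost. It is an illustration only, not the paper's benchmark harness: it assumes SHA-256 and defines difficulty as the number of leading zero hex digits in the digest, whereas the study covers multiple hash algorithms and IoT-specific instrumentation.

```python
import hashlib

def mine(block_data: bytes, difficulty: int):
    """Search for a nonce whose SHA-256 digest has `difficulty`
    leading zero hex digits (a simple PoW difficulty notion).

    Expected work grows by a factor of 16 per unit of difficulty,
    which is why energy cost rises so sharply on low-power nodes.
    """
    target = "0" * difficulty
    nonce = 0
    while True:
        # Hash the block contents together with the candidate nonce.
        digest = hashlib.sha256(
            block_data + nonce.to_bytes(8, "big")
        ).hexdigest()
        if digest.startswith(target):
            return nonce, digest  # valid proof found
        nonce += 1

if __name__ == "__main__":
    nonce, digest = mine(b"demo-block", 3)
    print(nonce, digest)
```

On constrained hardware, each iteration of this loop draws measurable energy, so recording total energy against `difficulty` yields the consumption curves the study analyzes.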