Initial commit

.gitignore (vendored, new file, 4 lines)
@@ -0,0 +1,4 @@
__pycache__/
venv/
.venv/
config.py
LICENSE (new file, 21 lines)
@@ -0,0 +1,21 @@
MIT License

Copyright (c) 2026 Davide Grilli

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
README.md (new file, 180 lines)
@@ -0,0 +1,180 @@
# SHA-256 Miner (From Scratch, Python Starter)

A clean, production-minded **starting point** for building SHA-256 miners from scratch.

This repository provides a complete, readable Python implementation of a Bitcoin-style mining pipeline using `getblocktemplate` and JSON-RPC. It is intentionally written in Python to keep the architecture simple, inspectable, and easy to evolve.

## Positioning

This project is designed for engineers who want to:

- understand mining internals end-to-end
- customize nonce-search and block-building logic
- use a practical base before moving performance-critical parts to lower-level languages

This project is **not** an ASIC-grade miner and is **not** intended for profitable mainnet mining.

## What You Get

- End-to-end mining flow (`template -> coinbase -> merkle -> header -> PoW -> submit`)
- Multiprocess worker orchestration
- Worker-specific `extranonce2` generation
- SegWit-aware template processing
- Coinbase construction with custom message support
- Block serialization and submission to Bitcoin Core
- Watchdog-based restart on chain tip updates
- Live aggregated dashboard (attempts/hashrate by worker)

## Repository Layout

- `launcher.py`: supervisor, process lifecycle, dashboard
- `main.py`: single-worker mining loop
- `miner.py`: PoW nonce search engine
- `block_builder.py`: coinbase/header/block assembly
- `rpc.py`: Bitcoin Core RPC client helpers
- `utils.py`: hashing, target conversion, watchdog logic
- `config.py`: runtime configuration
- `config.py.example`: full configuration template
## Architecture Overview

1. `launcher.py` starts N worker processes.
2. Each worker requests a fresh block template from Bitcoin Core.
3. Coinbase is built using block height, optional message, `EXTRANONCE1`, and worker-specific `EXTRANONCE2`.
4. Merkle root and block header are assembled.
5. `miner.py` searches for a valid nonce according to the selected strategy.
6. If a valid header is found, the block is serialized and submitted.
7. Workers restart with a new template after successful submission or tip change.
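The proof-of-work test in steps 5-6 reduces to double-SHA256 over the 80-byte header and one integer comparison against the target. A minimal sketch (illustrative helper, not part of this repository; the real search lives in `miner.py`):

```python
import hashlib

def header_meets_target(header_80: bytes, target: int) -> bool:
    """Return True if sha256d(header), read as a big-endian integer of the
    reversed digest, is at or below the target."""
    digest = hashlib.sha256(hashlib.sha256(header_80).digest()).digest()
    # Block hashes are displayed/compared byte-reversed relative to the digest.
    return int.from_bytes(digest[::-1], "big") <= target

# A trivially easy target accepts any header; an impossible one rejects it.
header = bytes(80)
assert header_meets_target(header, 2**256 - 1)
assert not header_meets_target(header, 0)
```

The nonce search simply repeats this check while varying the last 4 header bytes.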
## Requirements

- Python 3.10+
- Bitcoin Core node with RPC enabled
- Recommended network for development: `regtest`

Install:

```bash
python -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
```

## Bitcoin Core Setup

Example `bitcoin.conf` values:

```ini
server=1
rpcuser=user
rpcpassword=password
rpcallowip=127.0.0.1
rpcport=18443
regtest=1
```

Use values consistent with your local node and wallet setup.

## Configuration (`config.py`)

All runtime settings are defined in `config.py`.
Use `config.py.example` as the canonical template.

Minimal example:

```python
import multiprocessing as mp

# RPC
RPC_USER = "user"
RPC_PASSWORD = "password"
RPC_HOST = "127.0.0.1"
RPC_PORT = 18443

# Wallet
WALLET_ADDRESS = ""

# Mining
DIFFICULTY_FACTOR = 1.0
NONCE_MODE = "incremental"  # incremental | random | mixed
TIMESTAMP_UPDATE_INTERVAL = 30
BATCH = 10_000
COINBASE_MESSAGE = "/py-miner/"
EXTRANONCE1 = "1234567890abcdef"
EXTRANONCE2 = "12341234"

# Workers
_NUM_PROCESSORS = 0
NUM_PROCESSORS = _NUM_PROCESSORS if _NUM_PROCESSORS > 0 else mp.cpu_count()

# Watchdog
CHECK_INTERVAL = 20
```
## Supported Configuration Behavior

- `NONCE_MODE`: must be `incremental`, `random`, or `mixed`
- `TIMESTAMP_UPDATE_INTERVAL`: seconds; `0` disables periodic timestamp updates
- `BATCH`: number of nonces per compute batch
- `NUM_PROCESSORS`:
  - `> 0`: exact number of workers
  - `<= 0`: auto CPU count
- `DIFFICULTY_FACTOR`:
  - on `regtest`:
    - `0`: keep original template target
    - `> 0`: compute target as `max_target / factor`
    - `< 0`: clamped to `0.1`
  - on non-`regtest`: forced to `1.0`
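One possible reading of the `regtest` rules above, sketched as a standalone function. The actual logic lives in `utils.calculate_target`; the `MAX_TARGET` constant and the clamping interpretation are assumptions here:

```python
from fractions import Fraction

# Assumed difficulty-1 maximum target; the real constant lives in utils.py.
MAX_TARGET = 0x00000000FFFF0000000000000000000000000000000000000000000000000000

def regtest_target(template_target: int, factor: float) -> int:
    """Sketch of the documented DIFFICULTY_FACTOR rules on regtest."""
    if factor == 0:
        return template_target  # 0: keep the template's own target
    if factor < 0:
        factor = 0.1            # negative values: clamped to 0.1
    # > 0: max_target / factor (larger factor -> smaller target -> harder).
    # Fraction avoids float precision loss on a 256-bit integer.
    return int(MAX_TARGET / Fraction(factor))
```

For example, `regtest_target(t, 2.0)` halves the maximum target, roughly doubling the expected work per block.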
## Run

Default:

```bash
python launcher.py
```

CLI options:

- `-n`, `--num-procs`: number of workers
- `--base-extranonce2`: base hex value for per-worker extranonce2 derivation

Example:

```bash
python launcher.py -n 4 --base-extranonce2 12341234
```
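Per-worker `extranonce2` derivation follows `launcher.py`'s `_extranonce2` helper: the base hex value plus the worker index, zero-padded back to the base's width:

```python
def derive_extranonce2(base: str, idx: int) -> str:
    """Mirror of launcher.py's per-worker derivation: base + idx in hex,
    formatted to the same width as the base string."""
    return f"{int(base, 16) + idx:0{len(base)}x}"

# With --base-extranonce2 12341234 and 4 workers:
values = [derive_extranonce2("12341234", i) for i in range(4)]
# -> ["12341234", "12341235", "12341236", "12341237"]
```

Distinct extranonce2 values give each worker a disjoint coinbase, and therefore a disjoint header search space.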
## Operational Notes

- `EXTRANONCE1` and `EXTRANONCE2` must be valid hex strings.
- Coinbase `scriptSig` must remain within protocol constraints (the implementation enforces a 100-byte limit).
- Invalid RPC credentials or an invalid wallet address cause a runtime failure.
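A pre-flight check for the first note above might look like this (hypothetical helper, not part of the repository):

```python
import binascii

def check_extranonce(value: str) -> bytes:
    """Hex-decode an extranonce, raising binascii.Error on non-hex
    characters or odd-length input."""
    return binascii.unhexlify(value)

check_extranonce("12341234")   # ok: decodes to 4 bytes
try:
    check_extranonce("xyz")    # rejected: not hex / odd length
except binascii.Error:
    pass
```

Validating once at startup fails fast instead of producing an unparsable coinbase later in the cycle.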
## Tested Capacity

Reference test result for this miner implementation:

- Platform: Raspberry Pi 5
- CPU usage: 4 cores
- Observed throughput: **~1.8 MH/s**

Performance depends on clock profile, thermal conditions, operating system, and background load.

## Current Limits (By Design)

This is a Python starter implementation. It intentionally does not include:

- ASIC/GPU kernels
- Stratum server/client stack
- advanced failover/job persistence
- extensive benchmark/profiling harness

## Recommended Evolution Path

1. Add strict schema validation for `config.py`.
2. Add deterministic unit tests for block assembly and hashing paths.
3. Introduce microbenchmarks for nonce-loop and hashing batches.
4. Add integration tests against disposable `bitcoind` `regtest` nodes.
5. Extract core components into reusable packages (RPC, builder, PoW engine).
6. Move performance-critical sections to C/Rust/CUDA while preserving Python orchestration.

## License

MIT. See [LICENSE](LICENSE).
block_builder.py (new file, 167 lines)
@@ -0,0 +1,167 @@
import struct, logging
from binascii import unhexlify, hexlify
from utils import double_sha256, encode_varint

log = logging.getLogger(__name__)


def tx_encode_coinbase_height(height: int) -> str:
    """
    Encode block height according to BIP34 (CScriptNum format) to include it
    in the coinbase transaction scriptSig.
    """
    if height < 0:
        raise ValueError("Block height must be greater than or equal to 0.")
    if height == 0:
        return "00"
    result = bytearray()
    v = height
    while v:
        result.append(v & 0xff)
        v >>= 8
    # If the most significant bit of the last byte is 1, append 0x00
    if result and (result[-1] & 0x80):
        result.append(0x00)
    return f"{len(result):02x}" + result.hex()


def is_segwit_tx(raw_hex: str) -> bool:
    """
    Return True if the transaction is serialized in SegWit format.
    """
    return len(raw_hex) >= 12 and raw_hex[8:12] == "0001"


def build_coinbase_transaction(template, miner_script_pubkey, extranonce1, extranonce2, coinbase_message=None):
    """Build a coinbase transaction for mining."""
    height = template["height"]
    reward = template["coinbasevalue"]
    wc_raw = template.get("default_witness_commitment")  # can be full script or root only
    segwit = bool(wc_raw)

    tx_version = struct.pack("<I", 2).hex()  # coinbase tx version 2 (BIP68/SegWit)
    parts = [tx_version]
    if segwit:
        parts.append("0001")  # marker + flag

    # ---- coinbase input ------------------------------------------------
    parts += ["01", "00" * 32, "ffffffff"]

    script_sig = tx_encode_coinbase_height(height)
    if coinbase_message:
        m = coinbase_message.encode()
        script_sig += "6a" + f"{len(m):02x}" + m.hex()
    # Add extranonce1 and extranonce2 as required by Stratum V1
    script_sig += extranonce1 + extranonce2

    if len(script_sig) // 2 > 100:
        raise ValueError("scriptSig > 100 bytes")

    parts.append(encode_varint(len(script_sig) // 2) + script_sig)
    parts.append("ffffffff")  # sequence

    # ---- outputs -------------------------------------------------------
    outputs = []

    miner_out = struct.pack("<Q", reward).hex()
    miner_out += encode_varint(len(miner_script_pubkey) // 2) + miner_script_pubkey
    outputs.append(miner_out)

    if segwit:
        # wc_script is already complete if it starts with '6a'
        if wc_raw.startswith("6a"):
            wc_script = wc_raw
        else:  # root only: build full script
            wc_script = "6a24aa21a9ed" + wc_raw
        outputs.append("00" * 8 + encode_varint(len(wc_script) // 2) + wc_script)

    parts.append(encode_varint(len(outputs)) + "".join(outputs))

    # ---- reserved witness ---------------------------------------------
    if segwit:
        parts += ["01", "20", "00" * 32]  # 1 item x 32 bytes

    parts.append("00000000")  # locktime
    coinbase_hex = "".join(parts)

    # ---------- legacy txid (without marker/flag + witness) -------------
    if segwit:
        # 1) remove marker+flag "0001"
        core = tx_version + coinbase_hex[12:]

        # 2) split locktime (last 8 hex chars)
        locktime = core[-8:]  # "00000000"
        body = core[:-8]      # everything except locktime

        # 3) remove witness stack (68 hex chars = 34 bytes) right before locktime
        body_wo_wit = body[:-68]

        # 4) rebuild body + locktime
        core = body_wo_wit + locktime
    else:
        core = coinbase_hex

    txid = double_sha256(unhexlify(core))[::-1].hex()
    return coinbase_hex, txid


def calculate_merkle_root(coinbase_txid: str, transactions: list[dict]) -> str:
    """Compute the Merkle root over the coinbase txid plus template txids."""
    # Leaves as little-endian bytes. Use "txid" rather than "hash": in
    # getblocktemplate output "hash" is the wtxid for SegWit transactions,
    # and the header Merkle root must commit to txids.
    tx_hashes = [unhexlify(coinbase_txid)[::-1]] + [
        unhexlify(tx["txid"])[::-1] for tx in transactions
    ]

    while len(tx_hashes) > 1:
        if len(tx_hashes) % 2:
            tx_hashes.append(tx_hashes[-1])
        tx_hashes = [
            double_sha256(tx_hashes[i] + tx_hashes[i + 1])
            for i in range(0, len(tx_hashes), 2)
        ]

    return hexlify(tx_hashes[0][::-1]).decode()


def build_block_header(version, prev_hash, merkle_root, timestamp, bits, nonce):
    """
    Build the 80-byte block header and return it as a hex string.
    """
    # Build header by concatenating all fields in binary format
    header = (
        struct.pack("<I", version) +    # Version (4 bytes, little-endian)
        unhexlify(prev_hash)[::-1] +    # Previous Block Hash (32 bytes, reversed)
        unhexlify(merkle_root)[::-1] +  # Merkle Root (32 bytes, reversed)
        struct.pack("<I", timestamp) +  # Timestamp (4 bytes, little-endian)
        unhexlify(bits)[::-1] +         # Bits/Target (4 bytes, reversed)
        struct.pack("<I", nonce)        # Nonce (4 bytes, little-endian)
    )
    # Convert binary header to hexadecimal
    return hexlify(header).decode()


def serialize_block(header_hex, coinbase_tx, transactions):
    """
    Serialize the full block in Bitcoin wire format.
    """
    # High-level message -> INFO
    log.info("Block serialization started")

    # Compute total transaction count (coinbase + regular transactions)
    num_tx = len(transactions) + 1  # +1 to include coinbase
    # Encode transaction count as hex VarInt
    num_tx_hex = encode_varint(num_tx)

    try:
        # Concatenate all regular transactions in hex
        transactions_hex = "".join(tx["data"] for tx in transactions)
    except KeyError:
        # Operational error -> ERROR (includes stack trace)
        log.exception("A transaction is missing the 'data' field")
        return None

    # Assemble full block: header + tx count + coinbase + remaining txs
    block_hex = header_hex + num_tx_hex + coinbase_tx + transactions_hex

    # Success confirmation -> INFO
    log.info("Block serialized successfully - %d total transactions", num_tx)

    # Verbose detail (can be thousands of characters) -> DEBUG
    log.debug("Block HEX: %s", block_hex)

    # Return serialized block
    return block_hex
config.py.example (new file, 67 lines)
@@ -0,0 +1,67 @@
"""Complete miner configuration example.

This file documents all options supported by the program.
"""

import multiprocessing as mp

# ---------------------------------------------------------------------------
# RPC (Bitcoin Core)
# ---------------------------------------------------------------------------
RPC_USER = "user"
RPC_PASSWORD = "password"
RPC_HOST = "127.0.0.1"
RPC_PORT = 18443

# ---------------------------------------------------------------------------
# Wallet
# ---------------------------------------------------------------------------
# Address that receives the coinbase reward.
# It must be valid for the node you are connected to.
WALLET_ADDRESS = ""

# ---------------------------------------------------------------------------
# Mining
# ---------------------------------------------------------------------------
# Network behavior:
# - regtest: 0 -> use network target; >0 -> max_target / factor; <0 -> forced to 0.1
# - non-regtest: ignored and always set to 1.0
DIFFICULTY_FACTOR = 1.0

# Supported values:
# - "incremental": initial nonce is 0, then increments
# - "random": initial nonce is random, then increments by batch
# - "mixed": initial nonce is random, then increments
NONCE_MODE = "incremental"

# Interval (seconds) to update the header timestamp.
# 0 disables periodic timestamp updates.
TIMESTAMP_UPDATE_INTERVAL = 30

# Number of nonces computed per batch (int > 0 recommended).
BATCH = 10_000

# Custom message inserted into coinbase scriptSig.
COINBASE_MESSAGE = "/py-miner/"

# Extranonce values used in coinbase scriptSig.
# Must be hexadecimal strings (only 0-9a-fA-F) with even length.
EXTRANONCE1 = "1234567890abcdef"
EXTRANONCE2 = "12341234"

# Practical note: total coinbase scriptSig must remain <= 100 bytes.

# ---------------------------------------------------------------------------
# Worker
# ---------------------------------------------------------------------------
# Same semantics as config.py:
# - >0: use that exact number of processes
# - <=0: use all available CPU cores
_NUM_PROCESSORS = 0
NUM_PROCESSORS = _NUM_PROCESSORS if _NUM_PROCESSORS > 0 else mp.cpu_count()

# ---------------------------------------------------------------------------
# Watchdog
# ---------------------------------------------------------------------------
# Best-block polling interval (seconds).
CHECK_INTERVAL = 20
launcher.py (new file, 198 lines)
@@ -0,0 +1,198 @@
from __future__ import annotations

import argparse
import importlib
import logging
import multiprocessing as mp
import os
import queue
import sys
import time

import config

log = logging.getLogger(__name__)


# ---------------------------------------------------------------------------
# Utilities
# ---------------------------------------------------------------------------

def _extranonce2(base: str, idx: int) -> str:
    """Return `base + idx` in hex, preserving the same width."""
    return f"{int(base, 16) + idx:0{len(base)}x}"


# ---------------------------------------------------------------------------
# Worker
# ---------------------------------------------------------------------------

def _worker(idx: int, base_ex2: str, q: mp.Queue) -> None:
    """Start a mining process and send structured events to the supervisor."""
    try:
        os.sched_setaffinity(0, {idx})
    except (AttributeError, OSError):
        pass

    # Workers send structured events through the queue; verbose logs are suppressed
    logging.basicConfig(
        level=logging.WARNING,
        format="%(asctime)s | %(levelname)s | %(name)s | %(message)s",
    )

    main = importlib.import_module("main")
    try:
        main.main(
            event_queue=q,
            worker_idx=idx,
            extranonce2=_extranonce2(base_ex2, idx),
        )
    except KeyboardInterrupt:
        pass


# ---------------------------------------------------------------------------
# Supervisor
# ---------------------------------------------------------------------------

def _clear_lines(n: int) -> None:
    for _ in range(n):
        sys.stdout.write("\033[F\033[K")
    sys.stdout.flush()


def _aggregate(q: mp.Queue, n: int) -> str:
    """
    Receive structured events from workers and update the dashboard.
    Return "restart" when a block is found and submitted.
    """
    rates: list[float] = [0.0] * n
    attempts: list[int] = [0] * n
    block_hash: str | None = None
    winner_idx: int | None = None
    winner_rate: float | None = None

    t_start = time.time()
    last_print = 0.0
    lines_printed = 0

    while True:
        try:
            tag, idx, val = q.get(timeout=0.1)

            if tag == "status":
                rates[idx] = val["rate"]
                attempts[idx] = val["attempts"]
            elif tag == "found":
                winner_idx = idx
                winner_rate = val.get("rate") if val else None
            elif tag == "hash":
                block_hash = val
            elif tag == "submit":
                _clear_lines(lines_printed)
                elapsed = time.time() - t_start
                total_att = sum(attempts)
                avg_rate_k = total_att / elapsed / 1000 if elapsed else 0.0
                print("=" * 78)
                print("[✓] BLOCK FOUND AND SUBMITTED")
                print(f"  • Hash: {block_hash or 'N/D'}")
                if winner_idx is not None:
                    print(f"  • Worker: {winner_idx}")
                if winner_rate is not None:
                    print(f"  • Worker hashrate: {winner_rate:.2f} kH/s")
                print(f"  • Average total hashrate: {avg_rate_k:,.2f} kH/s")
                print(f"  • Total attempts: {total_att:,}")
                print("=" * 78)
                return "restart"

        except queue.Empty:
            pass  # no events within the timeout

        now = time.time()
        if now - last_print >= 1.0:
            if lines_printed > 0:
                _clear_lines(lines_printed)

            tot_rate = sum(rates)
            tot_att = sum(attempts)
            ts = time.strftime("%Y-%m-%d %H:%M:%S", time.localtime(now))

            lines = [
                f"{ts} | MINING STATUS",
                "=" * 40,
                f"Total: {tot_rate:,.2f} kH/s | Attempts: {tot_att:,}",
                "-" * 40,
            ]
            for i in range(n):
                lines.append(f"Worker {i:<2}: {rates[i]:.2f} kH/s | Attempts: {attempts[i]:,}")

            print("\n".join(lines), flush=True)
            lines_printed = len(lines)
            last_print = now


# ---------------------------------------------------------------------------
# Start/restart loop
# ---------------------------------------------------------------------------

def launch(n: int, base_ex2: str) -> None:
    log.info("Per-process extranonce2:")
    for i in range(n):
        log.info("  • Process %d: extranonce2=%s", i, _extranonce2(base_ex2, i))

    while True:
        q = mp.Queue()
        workers = [
            mp.Process(target=_worker, args=(i, base_ex2, q), daemon=True)
            for i in range(n)
        ]
        for p in workers:
            p.start()

        try:
            reason = _aggregate(q, n)
        finally:
            for p in workers:
                if p.is_alive():
                    p.terminate()
            for p in workers:
                p.join()

        if reason != "restart":
            break
        print("\nRestarting workers...\n")
        time.sleep(1)


# ---------------------------------------------------------------------------
# CLI entry point
# ---------------------------------------------------------------------------

def _parse_args() -> argparse.Namespace:
    parser = argparse.ArgumentParser("Multiprocess launcher for main.py miner")
    parser.add_argument(
        "-n", "--num-procs",
        type=int, default=config.NUM_PROCESSORS,
        help=f"Number of workers (default: {config.NUM_PROCESSORS})",
    )
    parser.add_argument(
        "--base-extranonce2",
        default=config.EXTRANONCE2,
        help=f"Hex base for EXTRANONCE2 (default: {config.EXTRANONCE2})",
    )
    return parser.parse_args()


if __name__ == "__main__":
    logging.basicConfig(
        level=logging.INFO,
        format="%(asctime)s | %(levelname)s | %(name)s | %(message)s",
    )

    mp.set_start_method("spawn", force=True)
    args = _parse_args()

    from rpc import test_rpc_connection
    test_rpc_connection()

    print(f"\nStarting mining with {args.num_procs} processes (extranonce2 base={args.base_extranonce2})\n")
    launch(args.num_procs, args.base_extranonce2)
main.py (new file, 145 lines)
@@ -0,0 +1,145 @@
|
|||||||
|
import hashlib
|
||||||
|
import logging
|
||||||
|
import threading
|
||||||
|
import time
|
||||||
|
|
||||||
|
import config
|
||||||
|
from block_builder import (
|
||||||
|
build_block_header, build_coinbase_transaction,
|
||||||
|
calculate_merkle_root, is_segwit_tx, serialize_block,
|
||||||
|
)
|
||||||
|
from miner import mine_block
|
||||||
|
from rpc import (
|
||||||
|
connect_rpc, get_best_block_hash, get_block_template,
|
||||||
|
ensure_witness_data, submit_block, test_rpc_connection,
|
||||||
|
)
|
||||||
|
from utils import calculate_target, watchdog_bestblock
|
||||||
|
|
||||||
|
log = logging.getLogger(__name__)
|
||||||
|
|
||||||
|
|
||||||
|
def _prepare_template(rpc) -> dict | None:
|
||||||
|
"""Fetch and enrich the block template. Return None on error."""
|
||||||
|
template = get_block_template(rpc)
|
||||||
|
if not template:
|
||||||
|
return None
|
||||||
|
ensure_witness_data(rpc, template)
|
||||||
|
tot_tx = len(template["transactions"])
|
||||||
|
witness_tx = sum(1 for tx in template["transactions"] if is_segwit_tx(tx["data"]))
|
||||||
|
log.info(
|
||||||
|
"Template height=%d total tx=%d legacy=%d segwit=%d",
|
||||||
|
template["height"], tot_tx, tot_tx - witness_tx, witness_tx,
|
||||||
|
)
|
||||||
|
return template
|
||||||
|
|
||||||
|
|
||||||
|
def main(
|
||||||
|
event_queue=None,
|
||||||
|
worker_idx: int = 0,
|
||||||
|
extranonce2: str | None = None,
|
||||||
|
) -> None:
|
||||||
|
"""
|
||||||
|
Main mining loop.
|
||||||
|
|
||||||
|
Optional parameters for multiprocess usage via launcher:
|
||||||
|
event_queue - queue used to send structured events to supervisor
|
||||||
|
worker_idx - worker index (for event identification)
|
||||||
|
extranonce2 - worker-specific extranonce2 value
|
||||||
|
"""
|
||||||
|
extranonce2 = extranonce2 or config.EXTRANONCE2
|
||||||
|
|
||||||
|
test_rpc_connection()
|
||||||
|
log.info("Extranonce2: %s | Coinbase: %s", extranonce2, config.COINBASE_MESSAGE)
|
||||||
|
|
||||||
|
# Main connection reused for the whole process lifecycle
|
||||||
|
rpc = connect_rpc()
|
||||||
|
|
||||||
|
# Fetch chain and scriptPubKey once - they do not change across cycles
|
||||||
|
network = rpc.getblockchaininfo().get("chain", "")
|
||||||
|
miner_script = rpc.getaddressinfo(config.WALLET_ADDRESS)["scriptPubKey"]
|
||||||
|
log.info("Chain: %s", network)
|
||||||
|
|
||||||
|
def _on_status(attempts: int, hashrate: float) -> None:
|
||||||
|
if event_queue is not None:
|
||||||
|
event_queue.put(("status", worker_idx, {"rate": hashrate / 1000, "attempts": attempts}))
|
||||||
|
|
||||||
|
while True:
|
||||||
|
try:
|
||||||
|
log.info("=== New mining cycle ===")
|
||||||
|
|
||||||
|
# STEP 1-3: template
|
||||||
|
template = _prepare_template(rpc)
|
||||||
|
if not template:
|
||||||
|
log.error("Unable to fetch template. Retrying in 5s...")
|
||||||
|
time.sleep(5)
|
||||||
|
continue
|
||||||
|
|
||||||
|
# STEP 4: coinbase
|
||||||
|
coinbase_tx, coinbase_txid = build_coinbase_transaction(
|
||||||
|
template, miner_script,
|
||||||
|
config.EXTRANONCE1, extranonce2,
|
||||||
|
config.COINBASE_MESSAGE,
|
||||||
|
)
|
||||||
|
|
||||||
|
# STEP 5-7: target, Merkle root, header
|
||||||
|
modified_target = calculate_target(template, config.DIFFICULTY_FACTOR, network)
|
||||||
|
merkle_root = calculate_merkle_root(coinbase_txid, template["transactions"])
|
||||||
|
header_hex = build_block_header(
|
||||||
|
                template["version"], template["previousblockhash"],
                merkle_root, template["curtime"], template["bits"], 0,
            )

            # STEP 8: start watchdog and mining
            stop_event = threading.Event()
            new_block_event = threading.Event()
            rpc_watch = connect_rpc()
            t_watch = threading.Thread(
                target=watchdog_bestblock,
                args=(rpc_watch, stop_event, new_block_event, get_best_block_hash),
                daemon=True,
            )
            t_watch.start()

            mined_header_hex, nonce, hashrate = mine_block(
                header_hex, modified_target, config.NONCE_MODE, stop_event, _on_status,
            )

            stop_event.set()
            t_watch.join(timeout=0.2)

            if new_block_event.is_set() or mined_header_hex is None:
                log.info("Cycle interrupted: restarting with updated template")
                continue

            # STEP 9: block hash and supervisor notification
            header_bytes = bytes.fromhex(mined_header_hex)
            block_hash = hashlib.sha256(hashlib.sha256(header_bytes).digest()).digest()[::-1].hex()
            log.info("Found block hash: %s", block_hash)

            if event_queue is not None:
                event_queue.put(("found", worker_idx, {"rate": hashrate / 1000 if hashrate else 0}))
                event_queue.put(("hash", worker_idx, block_hash))

            # STEP 10: serialize and submit
            serialized_block = serialize_block(mined_header_hex, coinbase_tx, template["transactions"])
            if not serialized_block:
                log.error("Block serialization failed. Retrying...")
                continue

            submit_block(rpc, serialized_block)

            if event_queue is not None:
                event_queue.put(("submit", worker_idx, None))

        except Exception:
            log.exception("Error in mining cycle")

        time.sleep(1)


if __name__ == "__main__":
    logging.basicConfig(
        level=logging.INFO,
        format="%(asctime)s | %(levelname)s | %(name)s | %(message)s",
    )
    main()
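The hash logged at STEP 9 follows Bitcoin's display convention: double SHA-256 over the 80-byte serialized header, then byte-reversed, since hashes are stored little-endian internally but shown big-endian. A minimal self-contained sketch (the header bytes below are dummy values, not a real block):

```python
import hashlib

def display_block_hash(header_80: bytes) -> str:
    """Double SHA-256 of the 80-byte header, byte-reversed for display."""
    assert len(header_80) == 80, "a Bitcoin block header is exactly 80 bytes"
    digest = hashlib.sha256(hashlib.sha256(header_80).digest()).digest()
    return digest[::-1].hex()  # internal little-endian -> displayed big-endian

# Dummy header: 76 invariant bytes plus a 4-byte little-endian nonce
header = bytes(76) + (0).to_bytes(4, "little")
print(display_block_hash(header))  # 64 hex characters
```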
140
miner.py
Normal file
@@ -0,0 +1,140 @@
import random
import struct
import time
import hashlib
import logging
from binascii import hexlify, unhexlify
from threading import Event
from typing import Callable

import config

log = logging.getLogger(__name__)

# Seconds between hashrate logs
_RATE_INT = 2


def _compute_hash_batch(
    header_76: bytes,
    start_nonce: int,
    batch_size: int,
    target_be: bytes,
):
    """
    Compute hashes for a nonce batch and return the first valid nonce found.

    Optimizations:
    - Precompute the first chunk (64 bytes) with sha256.copy() to avoid
      re-updating the invariant portion on each nonce.
    - Preallocate a 16-byte tail and update only the nonce field
      with struct.pack_into to minimize allocations.
    - Local function binding to reduce lookups in the hot loop.
    """
    first_chunk = header_76[:64]
    tail_static = header_76[64:]  # 12 bytes: merkle[28:32] + ts(4) + bits(4)

    sha_base = hashlib.sha256()
    sha_base.update(first_chunk)

    tail = bytearray(16)
    tail[0:12] = tail_static

    sha_copy = sha_base.copy
    sha256 = hashlib.sha256
    pack_into = struct.pack_into

    for i in range(batch_size):
        n = (start_nonce + i) & 0xFFFFFFFF
        pack_into("<I", tail, 12, n)

        ctx = sha_copy()
        ctx.update(tail)
        digest = sha256(ctx.digest()).digest()

        if digest[::-1] < target_be:
            return n, digest

    return None, None


def mine_block(
    header_hex: str,
    target_hex: str,
    nonce_mode: str = "incremental",
    stop_event: Event | None = None,
    status_callback: Callable | None = None,
):
    """
    Iterate nonces until a valid hash is found or stop_event is set.

    Calls status_callback(attempts, hashrate_hz) every ~2 seconds if provided.
    Returns (mined_header_hex, nonce, hashrate) or (None, None, None) if stopped.
    """
    log.info("Starting mining - mode %s", nonce_mode)

    if nonce_mode not in ("incremental", "random", "mixed"):
        raise ValueError(f"Invalid mining mode: {nonce_mode!r}")

    version = unhexlify(header_hex[0:8])
    prev_hash = unhexlify(header_hex[8:72])
    merkle = unhexlify(header_hex[72:136])
    ts_bytes = unhexlify(header_hex[136:144])
    bits = unhexlify(header_hex[144:152])

    header_76 = version + prev_hash + merkle + ts_bytes + bits
    target_be = int(target_hex, 16).to_bytes(32, "big")

    nonce = 0 if nonce_mode == "incremental" else random.randint(0, 0xFFFFFFFF)

    # Read configuration once before entering the loop
    batch_size = config.BATCH
    ts_interval = config.TIMESTAMP_UPDATE_INTERVAL

    attempts = 0
    start_t = time.time()
    last_rate_t = start_t
    last_rate_n = 0
    last_tsu = start_t

    while True:
        if stop_event is not None and stop_event.is_set():
            log.info("Mining stopped: stop_event received")
            return None, None, None

        now = time.time()

        # Periodic timestamp update in the block header
        if ts_interval and (now - last_tsu) >= ts_interval:
            ts_bytes = struct.pack("<I", int(now))
            header_76 = version + prev_hash + merkle + ts_bytes + bits
            last_tsu = now

        found_nonce, digest = _compute_hash_batch(header_76, nonce, batch_size, target_be)

        if found_nonce is not None:
            total = time.time() - start_t
            hashrate = (attempts + batch_size) / total if total else 0
            full_header = header_76 + struct.pack("<I", found_nonce)
            log.info(
                "Block found - nonce=%d attempts=%d time=%.2fs hashrate=%.2f kH/s",
                found_nonce, attempts + batch_size, total, hashrate / 1000,
            )
            log.info("Valid hash: %s", digest[::-1].hex())
            return hexlify(full_header).decode(), found_nonce, hashrate

        attempts += batch_size
        nonce = (nonce + batch_size) & 0xFFFFFFFF

        # Periodic logs and callback
        now = time.time()
        if now - last_rate_t >= _RATE_INT:
            hashrate = (attempts - last_rate_n) / (now - last_rate_t)
            last_rate_t = now
            last_rate_n = attempts
            log.info(
                "Mining status - hashrate=%.2f kH/s attempts=%d nonce=%d",
                hashrate / 1000, attempts, nonce,
            )
            if status_callback:
                status_callback(attempts, hashrate)
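The midstate trick in `_compute_hash_batch` works because SHA-256 consumes its input in 64-byte chunks: the internal state after the first 64 header bytes is identical for every nonce, so it can be computed once and `copy()`ed per attempt. A self-contained sketch verifying this against a naive double SHA-256 (the header bytes are dummy values, not tied to this repo):

```python
import hashlib
import struct

# 76 invariant header bytes (dummy values for illustration)
header_76 = bytes(range(76))

# Midstate: absorb the invariant first 64 bytes exactly once
base = hashlib.sha256()
base.update(header_76[:64])

def hash_with_nonce(nonce: int) -> bytes:
    """Finish the precomputed state with tail + nonce, then hash again."""
    ctx = base.copy()
    ctx.update(header_76[64:] + struct.pack("<I", nonce))
    return hashlib.sha256(ctx.digest()).digest()

def naive_hash(nonce: int) -> bytes:
    """Reference: double SHA-256 of the full 80-byte header."""
    full = header_76 + struct.pack("<I", nonce)
    return hashlib.sha256(hashlib.sha256(full).digest()).digest()

# Both paths must agree for any nonce
for n in (0, 1, 0xDEADBEEF, 0xFFFFFFFF):
    assert hash_with_nonce(n) == naive_hash(n)
```

The saving is roughly one SHA-256 compression per attempt, which matters in the hot loop even though pure Python remains far slower than native miners.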
4
requirements.txt
Normal file
@@ -0,0 +1,4 @@
python-bitcoinrpc==1.0
python-dotenv==1.0.0
numpy
numba
101
rpc.py
Normal file
@@ -0,0 +1,101 @@
import logging

from bitcoinrpc.authproxy import AuthServiceProxy

import config

log = logging.getLogger(__name__)


def connect_rpc() -> AuthServiceProxy:
    """Create an RPC connection to the Bitcoin node."""
    return AuthServiceProxy(
        f"http://{config.RPC_USER}:{config.RPC_PASSWORD}@{config.RPC_HOST}:{config.RPC_PORT}"
    )


def test_rpc_connection() -> None:
    """Check the connection and log basic blockchain information."""
    log.info("Checking RPC connection")
    try:
        info = connect_rpc().getblockchaininfo()
        log.info(
            "RPC connection successful - chain=%s, blocks=%d, difficulty=%s",
            info["chain"], info["blocks"], info["difficulty"],
        )
    except Exception:
        log.exception("RPC connection error")
        raise


def get_best_block_hash(rpc) -> str | None:
    """Fetch the most recent block hash."""
    try:
        h = rpc.getbestblockhash()
        log.debug("Best block hash: %s", h)
        return h
    except Exception as e:
        log.error("RPC error getbestblockhash: %s", e)
        return None


def get_block_template(rpc) -> dict | None:
    """Request a block template with SegWit support."""
    try:
        tpl = rpc.getblocktemplate({"rules": ["segwit"]})
        log.debug("Template received - height=%d, tx=%d", tpl["height"], len(tpl["transactions"]))
        return tpl
    except Exception as e:
        log.error("RPC error getblocktemplate: %s", e)
        return None


def ensure_witness_data(rpc, template: dict) -> None:
    """
    Enrich template transactions with full witness data.

    Uses a single HTTP batch call to reduce latency versus N single requests.
    """
    txs = template["transactions"]
    if not txs:
        return

    # JSON-RPC batch call: one HTTP request for all transactions
    try:
        batch = [["getrawtransaction", tx["txid"], False] for tx in txs]
        results = rpc._batch(batch)
        raw_map = {
            txs[r["id"]]["txid"]: r["result"]
            for r in results
            if r.get("result") is not None
        }
    except Exception as e:
        log.warning("RPC batch unavailable, falling back to single calls: %s", e)
        raw_map = {}
        for tx in txs:
            try:
                raw = rpc.getrawtransaction(tx["txid"], False)
                if raw:
                    raw_map[tx["txid"]] = raw
            except Exception as e2:
                log.debug("Missing raw witness for %s: %s", tx["txid"], e2)

    template["transactions"] = [
        {"hash": tx["txid"], "data": raw_map.get(tx["txid"], tx["data"])}
        for tx in txs
    ]


def submit_block(rpc, serialized_block: str) -> None:
    """Submit the mined block to the Bitcoin node."""
    if not serialized_block:
        log.error("Block not serialized correctly - submission canceled")
        return
    log.info("Submitting serialized block (%d bytes) to node", len(serialized_block) // 2)
    try:
        result = rpc.submitblock(serialized_block)
        if result is None:
            log.info("Block accepted into the blockchain")
        else:
            log.error("submitblock returned: %s", result)
    except Exception:
        log.exception("RPC error during submitblock")
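`ensure_witness_data` relies on a JSON-RPC batch: several calls packed into a single HTTP request, with responses matched back to requests by `id` since they may arrive in any order. A library-free sketch of that payload shape and the id-based matching (the txids and responses below are made up for illustration):

```python
import json

txids = ["aa" * 32, "bb" * 32]  # hypothetical txids

# One request object per call; "id" is the index into txids, so responses
# can be matched back even if the node returns them out of order.
payload = [
    {"jsonrpc": "2.0", "method": "getrawtransaction", "params": [txid, False], "id": i}
    for i, txid in enumerate(txids)
]

# Simulated out-of-order batch response: one hit, one miss
responses = [
    {"id": 1, "result": "raw-tx-hex-for-bb", "error": None},
    {"id": 0, "result": None, "error": {"code": -5, "message": "not found"}},
]

# Same matching pattern the fallback-free path uses: skip null results
raw_map = {txids[r["id"]]: r["result"] for r in responses if r.get("result") is not None}
print(json.dumps(payload[0]), raw_map)
```

This is why the code keys `raw_map` through `txs[r["id"]]["txid"]`: the `id` field, not list position, is the reliable join key.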
86
utils.py
Normal file
@@ -0,0 +1,86 @@
"""Common utility module with shared functions for the miner project."""

import hashlib
import logging
import threading

import config

log = logging.getLogger(__name__)


def double_sha256(data: bytes) -> bytes:
    """Compute the double SHA-256 of the given data."""
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()


def encode_varint(value: int) -> str:
    """Encode a number as a VarInt according to the Bitcoin protocol."""
    if value < 0xFD:
        return value.to_bytes(1, "little").hex()
    if value <= 0xFFFF:
        return "fd" + value.to_bytes(2, "little").hex()
    if value <= 0xFFFFFFFF:
        return "fe" + value.to_bytes(4, "little").hex()
    if value <= 0xFFFFFFFFFFFFFFFF:
        return "ff" + value.to_bytes(8, "little").hex()
    raise ValueError("Value exceeds maximum VarInt limit (2^64 - 1)")


def decode_nbits(nBits: int) -> str:
    """Decode the nBits field into a 256-bit target in hexadecimal format."""
    exponent = (nBits >> 24) & 0xFF
    significand = nBits & 0x007FFFFF
    return f"{(significand << (8 * (exponent - 3))):064x}"


def calculate_target(template, difficulty_factor: float, network: str) -> str:
    """Compute a modified target based on network and difficulty factor."""
    if network == "regtest":
        if difficulty_factor < 0:
            difficulty_factor = 0.1
    else:
        difficulty_factor = 1.0

    nBits_int = int(template["bits"], 16)
    original_target = decode_nbits(nBits_int)

    if difficulty_factor == 0:
        return original_target

    max_target = 0x00000000FFFF0000000000000000000000000000000000000000000000000000
    target_value = int(max_target / difficulty_factor)
    return f"{min(target_value, (1 << 256) - 1):064x}"


def watchdog_bestblock(
    rpc_conn,
    stop_event: threading.Event,
    new_block_event: threading.Event,
    get_best_block_hash_func,
) -> None:
    """
    Periodically check whether there is a new best block.

    When one is detected, set new_block_event and stop_event
    to interrupt the current mining run within the next batch.
    """
    log.info("Watchdog started.")
    try:
        last_hash = get_best_block_hash_func(rpc_conn)
    except Exception as e:
        log.error("Watchdog: unable to get initial hash: %s", e)
        return

    while not stop_event.wait(config.CHECK_INTERVAL):
        try:
            new_hash = get_best_block_hash_func(rpc_conn)
            if new_hash and new_hash != last_hash:
                log.info("New best block: %s", new_hash)
                last_hash = new_hash
                new_block_event.set()
                stop_event.set()  # interrupt current miner
                return
        except Exception as e:
            log.error("Watchdog error: %s", e)

    log.info("Watchdog stopped.")
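The rules in `encode_varint` and `decode_nbits` can be checked against well-known values: the VarInt prefix thresholds, and the genesis-era `bits` value `0x1d00ffff`, which decodes to Bitcoin's maximum target. A short worked recap, with the logic restated inline so the snippet runs on its own:

```python
def encode_varint(value: int) -> str:
    # Same Bitcoin-protocol rules: 1, 3, 5, or 9 bytes total
    if value < 0xFD:
        return value.to_bytes(1, "little").hex()
    if value <= 0xFFFF:
        return "fd" + value.to_bytes(2, "little").hex()
    if value <= 0xFFFFFFFF:
        return "fe" + value.to_bytes(4, "little").hex()
    return "ff" + value.to_bytes(8, "little").hex()

def decode_nbits(nbits: int) -> str:
    # Compact representation: 1-byte exponent, 23-bit significand
    exponent = (nbits >> 24) & 0xFF
    significand = nbits & 0x007FFFFF
    return f"{significand << (8 * (exponent - 3)):064x}"

assert encode_varint(0xFC) == "fc"             # single byte, no prefix
assert encode_varint(0xFD) == "fdfd00"         # little-endian after the prefix
assert encode_varint(0x10000) == "fe00000100"
# 0x1d00ffff -> 0xffff shifted left by 26 bytes: the well-known max target
assert decode_nbits(0x1D00FFFF) == "00000000ffff" + "0" * 52
```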