Merged
32 changes: 32 additions & 0 deletions README.md
@@ -1,57 +1,78 @@
# Blockchain RPC Exporter

This exporter scrapes metrics from blockchain RPC endpoints. Its purpose is to perform black-box testing on those endpoints.

## Metrics

The exporter currently supports all EVM-compatible chains. In addition, there is limited support for the following chains:

- Cardano (wss)
- Conflux (wss)
- Solana (https & wss)
- Bitcoin (https)
- Dogecoin (https)
- Filecoin (https)
- Starknet (https)
- Aptos (https)
- XRPL (https)

## Available Metrics

# Disclaimer

Please note that this tool is in the early development stage and should not be used to influence critical business decisions.
The project in its current form suits our short-term needs and will receive limited support. We encourage you to fork the project and extend it with additional functionality you might need.

## Development

You should install [pre-commit](https://pre-commit.com/) so that automated linting and formatting checks are performed before each commit.

Run:

```bash
pip install pre-commit
pre-commit install
```

### Running locally

1. Make sure you have Python 3 installed (> 3.11)
2. Set up your Python environment:

```bash
pip3 install virtualenv
virtualenv venv
source venv/bin/activate
pip install -r requirements.txt
pip install -r requirements-dev.txt
```

3. Generate a valid exporter config and validation file. For examples, see [config example](config/exporter_example/config.yml) and [validation example](config/exporter_example/validation.yml).
4. Export the paths of the generated configuration files relative to `src/exporter.py`:

```bash
export VALIDATION_FILE_PATH="validation.yml" # For example if we saved validation config file in src/validation.yml
export CONFIG_FILE_PATH="config.yml" # For example if we saved config file in src/config.yml
```

5. Finally, run the exporter:

```bash
python exporter.py
```

### Run with docker-compose

1. Generate a valid exporter config and validation file. For examples, see [config example](config/exporter_example/config.yml) and [validation example](config/exporter_example/validation.yml).
2. Export the paths of the generated configuration files relative to `docker-compose.yml`:

```bash
export VALIDATION_FILE_PATH="src/validation.yml" # For example if we saved validation config file in src/validation.yml
export CONFIG_FILE_PATH="src/config.yml" # For example if we saved config file in src/config.yml
```

3. Execute:

```bash
docker-compose build
docker-compose up
curl localhost:9090 # Prometheus
```

### Testing

Testing is performed using [pytest](https://docs.pytest.org/), run under [coverage.py](https://coverage.readthedocs.io/) to generate test coverage reports.
[pylint](https://pylint.readthedocs.io/) is used to lint the Python code.
These dependencies can be found in the [requirements-dev.txt](requirements-dev.txt) file. Unit tests and linting run on every commit pushed to the repository; 90% test coverage and no linter errors/warnings are required for the checks to pass.

#### Testing Locally (venv)

Tests can be run locally in the virtual environment.

1. Run the unit tests with coverage.py from within the `src` directory.

```bash
coverage run --branch -m pytest
```

2. Generate the coverage report. To view the report open the generated `index.html` file in a browser.

```bash
coverage html
```

3. Run the linter to find any errors/warnings.

```bash
pylint src/*.py
```

#### Testing Locally (docker)

The tests and linter can be run using docker by building the `test` docker stage.

1. Build the `test` stage in the `Dockerfile`.

```bash
docker build --target test .
```
70 changes: 70 additions & 0 deletions src/collectors.py
@@ -500,3 +500,73 @@ def client_version(self):
def latency(self):
"""Returns connection latency."""
return self.interface.latest_query_latency


class XRPLCollector():
"""A collector to fetch information about XRP Ledger endpoints."""

def __init__(self, url, labels, chain_id, **client_parameters):
self.labels = labels
self.chain_id = chain_id
self.interface = HttpsInterface(url, client_parameters.get('open_timeout'),
client_parameters.get('ping_timeout'))
self._logger_metadata = {
'component': 'XRPLCollector',
'url': strip_url(url)
}
self.ledger_closed_payload = {
'method': 'ledger_closed',
'params': [{}] # Required empty object in params array
}
self.server_info_payload = {
'method': 'server_info',
'params': [{}] # Required empty object in params array
}

def alive(self):
"""Returns true if endpoint is alive, false if not."""
return self.interface.cached_json_rpc_post(
self.ledger_closed_payload, non_rpc_response=True) is not None

def block_height(self):
"""Returns latest block height (ledger index)."""
response = self.interface.cached_json_rpc_post(
self.ledger_closed_payload, non_rpc_response=True)
if response is None:
return None

# For XRPL, the response will be the whole JSON object
if isinstance(response, dict) and 'result' in response:
result = response['result']
return validate_dict_and_return_key_value(
result, 'ledger_index', self._logger_metadata)
return None

def client_version(self):
"""Gets build version from server_info."""
response = self.interface.cached_json_rpc_post(
self.server_info_payload, non_rpc_response=True)
if response is None:
return None

# For XRPL, the response will be the whole JSON object
if isinstance(response, dict) and 'result' in response:
result = response['result']

if 'info' in result:
info = result['info']
version = validate_dict_and_return_key_value(
info, 'build_version', self._logger_metadata, stringify=True)

# If build_version is not found, try libxrpl_version
if version is None:
version = validate_dict_and_return_key_value(
info, 'libxrpl_version', self._logger_metadata, stringify=True)

if version is not None:
return {"client_version": version}
return None

def latency(self):
"""Returns connection latency."""
return self.interface.latest_query_latency
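
For context, the extraction logic above can be exercised against sample XRPL responses. This is a standalone sketch: the response shapes follow XRPL's public JSON-RPC API (`ledger_closed` and `server_info`), the values are made up, and plain dict access stands in for the collector's `validate_dict_and_return_key_value` helper.

```python
# Standalone sketch mirroring XRPLCollector's parsing of XRPL responses.
# Response shapes follow the public XRPL JSON-RPC API; values are illustrative.

ledger_closed_response = {
    "result": {"ledger_index": 96217031, "status": "success"}
}

server_info_response = {
    "result": {"info": {"build_version": "2.4.0"}, "status": "success"}
}


def block_height(response):
    """Mirror XRPLCollector.block_height: unwrap 'result', read 'ledger_index'."""
    if isinstance(response, dict) and "result" in response:
        return response["result"].get("ledger_index")
    return None


def client_version(response):
    """Mirror XRPLCollector.client_version: prefer build_version,
    fall back to libxrpl_version."""
    if isinstance(response, dict) and "result" in response:
        info = response["result"].get("info", {})
        version = info.get("build_version") or info.get("libxrpl_version")
        if version is not None:
            return {"client_version": str(version)}
    return None


print(block_height(ledger_closed_response))   # 96217031
print(client_version(server_info_response))   # {'client_version': '2.4.0'}
```

The fallback matters because some XRPL-family servers report `libxrpl_version` instead of `build_version`; both failing yields `None`, matching the collector's behavior on malformed responses.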
2 changes: 1 addition & 1 deletion src/configuration.py
@@ -45,7 +45,7 @@ def endpoints(self):
def _load_configuration(self):
supported_collectors = ('evm', 'evmhttp', 'cardano', 'conflux', 'solana',
'bitcoin', 'doge', 'filecoin', 'starknet', 'aptos',
-'tron')
+'tron', 'xrpl')

configuration_schema = Schema({
'blockchain':
17 changes: 12 additions & 5 deletions src/interfaces.py
@@ -70,26 +70,33 @@ def _return_and_validate_request(self, method='GET', payload=None, params=None):
**self._logger_metadata)
return None

-def json_rpc_post(self, payload):
+def json_rpc_post(self, payload, non_rpc_response=None):
"""Checks the validity of a successful json-rpc response. If any of the
validations fail, the method returns type None. """
response = self._return_and_validate_request(method='POST', payload=payload)
if response is not None:
result = return_and_validate_rpc_json_result(
response, self._logger_metadata)
# Use REST validation instead of RPC validation if non_rpc_response is True
# to handle non-RPC responses such as XRPL
if non_rpc_response:
result = return_and_validate_rest_api_json_result(
response, self._logger_metadata)
else:
result = return_and_validate_rpc_json_result(
response, self._logger_metadata)

if result is not None:
return result
return None

-def cached_json_rpc_post(self, payload: dict):
+def cached_json_rpc_post(self, payload: dict, non_rpc_response=None):
"""Calls json_rpc_post and stores the result in in-memory cache."""
cache_key = f"rpc:{str(payload)}"

if self.cache.is_cached(cache_key):
return_value = self.cache.retrieve_key_value(cache_key)
return return_value

value = self.json_rpc_post(payload=payload)
value = self.json_rpc_post(payload=payload, non_rpc_response=non_rpc_response)
if value is not None:
self.cache.store_key_value(cache_key, value)
return value
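
The caching pattern above keys the in-memory cache on the stringified payload, so identical queries within a scrape hit the endpoint only once. A minimal standalone sketch of that pattern (a plain dict stands in for the project's cache, and `fake_post` for the HTTP call):

```python
# Sketch of the cached_json_rpc_post pattern: the stringified payload forms
# the cache key; only successful (non-None) responses are cached.
# Names here are illustrative, not the project's actual classes.

calls = []


def fake_post(payload):
    """Stand-in for the real JSON-RPC POST; records each network call."""
    calls.append(payload)
    return {"result": {"ledger_index": 1}}


cache = {}


def cached_post(payload):
    key = f"rpc:{payload}"          # same key scheme as cached_json_rpc_post
    if key in cache:
        return cache[key]           # served from cache, no network call
    value = fake_post(payload)
    if value is not None:
        cache[key] = value          # failures are never cached, so they retry
    return value


cached_post({"method": "ledger_closed"})
cached_post({"method": "ledger_closed"})
print(len(calls))  # 1 -- the second call was served from cache
```

Because `None` results are never stored, a transient endpoint failure is retried on the next call rather than being cached as a permanent miss.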
2 changes: 2 additions & 0 deletions src/registries.py
@@ -90,6 +90,8 @@ def get_collector_registry(self) -> list:
collector = collectors.AptosCollector
case "tron", "tron":
collector = collectors.TronCollector
case "xrpl", "xrpl":
collector = collectors.XRPLCollector
case "evmhttp", other: # pylint: disable=unused-variable
collector = collectors.EvmHttpCollector
case "evm", other: # pylint: disable=unused-variable
109 changes: 109 additions & 0 deletions src/test_collectors.py
@@ -811,3 +811,112 @@ def test_latency(self):
"""Tests that the latency is obtained from the interface based on latest_query_latency"""
self.mocked_connection.return_value.latest_query_latency = 0.123
self.assertEqual(0.123, self.tron_collector.latency())

class TestXRPLCollector(TestCase):
"""Tests the XRPL collector class"""

def setUp(self):
self.url = "https://test.com"
self.labels = ["dummy", "labels"]
self.chain_id = 123
self.open_timeout = 8
self.ping_timeout = 9
self.client_params = {
"open_timeout": self.open_timeout, "ping_timeout": self.ping_timeout}
with mock.patch('collectors.HttpsInterface') as mocked_connection:
self.xrpl_collector = collectors.XRPLCollector(
self.url, self.labels, self.chain_id, **self.client_params)
self.mocked_connection = mocked_connection

def test_logger_metadata(self):
"""Validate logger metadata. Makes sure url is stripped by helpers.strip_url function."""
expected_metadata = {
'component': 'XRPLCollector', 'url': 'test.com'}
self.assertEqual(expected_metadata,
self.xrpl_collector._logger_metadata)

def test_https_interface_created(self):
"""Tests that the XRPL collector calls the https interface with the correct args"""
self.mocked_connection.assert_called_once_with(
self.url, self.open_timeout, self.ping_timeout)

def test_interface_attribute_exists(self):
"""Tests that the interface attribute exists."""
self.assertTrue(hasattr(self.xrpl_collector, 'interface'))

def test_alive_call(self):
"""Tests the alive function uses the correct call"""
self.xrpl_collector.alive()
self.mocked_connection.return_value.cached_json_rpc_post.assert_called_once_with(
self.xrpl_collector.ledger_closed_payload, non_rpc_response=True)

def test_alive_false(self):
"""Tests the alive function returns false when post returns None"""
self.mocked_connection.return_value.cached_json_rpc_post.return_value = None
result = self.xrpl_collector.alive()
self.assertFalse(result)

def test_block_height(self):
"""Tests the block_height function uses the correct call to get block height"""
self.xrpl_collector.block_height()
self.mocked_connection.return_value.cached_json_rpc_post.assert_called_once_with(
self.xrpl_collector.ledger_closed_payload, non_rpc_response=True)

def test_block_height_get_ledger_index(self):
"""Tests that the block height is returned with the ledger_index key"""
self.mocked_connection.return_value.cached_json_rpc_post.return_value = {
"result": {"ledger_index": 96217031}}
result = self.xrpl_collector.block_height()
self.assertEqual(96217031, result)

def test_block_height_key_error_returns_none(self):
"""Tests that the block height returns None on KeyError"""
self.mocked_connection.return_value.cached_json_rpc_post.return_value = {
"result": {"dummy_key": 5}}
result = self.xrpl_collector.block_height()
self.assertEqual(None, result)

def test_block_height_returns_none(self):
"""Tests that the block height returns None if json_rpc_post returns None"""
self.mocked_connection.return_value.cached_json_rpc_post.return_value = None
result = self.xrpl_collector.block_height()
self.assertEqual(None, result)

def test_client_version(self):
"""Tests the client_version function uses the correct call to get client version"""
self.xrpl_collector.client_version()
self.mocked_connection.return_value.cached_json_rpc_post.assert_called_once_with(
self.xrpl_collector.server_info_payload, non_rpc_response=True)

def test_client_version_get_build_version(self):
"""Tests that the client version is returned with the build_version key"""
self.mocked_connection.return_value.cached_json_rpc_post.return_value = {
"result": {"info": {"build_version": "2.4.0"}}}
result = self.xrpl_collector.client_version()
self.assertEqual({"client_version": "2.4.0"}, result)

def test_client_version_get_libxrpl_version(self):
"""Tests that the client version is returned with the libxrpl_version key
if build_version is not present"""
self.mocked_connection.return_value.cached_json_rpc_post.return_value = {
"result": {"info": {"libxrpl_version": "2.4.0"}}}
result = self.xrpl_collector.client_version()
self.assertEqual({"client_version": "2.4.0"}, result)

def test_client_version_key_error_returns_none(self):
"""Tests that the client_version returns None on KeyError"""
self.mocked_connection.return_value.cached_json_rpc_post.return_value = {
"result": {"info": {"dummy_key": "value"}}}
result = self.xrpl_collector.client_version()
self.assertEqual(None, result)

def test_client_version_returns_none(self):
"""Tests that the client_version returns None if json_rpc_post returns None"""
self.mocked_connection.return_value.cached_json_rpc_post.return_value = None
result = self.xrpl_collector.client_version()
self.assertEqual(None, result)

def test_latency(self):
"""Tests that the latency is obtained from the interface based on latest_query_latency"""
self.mocked_connection.return_value.latest_query_latency = 0.123
self.assertEqual(0.123, self.xrpl_collector.latency())
10 changes: 10 additions & 0 deletions src/test_registries.py
@@ -157,6 +157,16 @@ def test_get_collector_registry_for_tron(self):
with mock.patch('collectors.TronCollector', new=mock.Mock()) as collector:
helper_test_collector_registry(self, collector)

@mock.patch.dict(os.environ, {
"CONFIG_FILE_PATH": "tests/fixtures/configuration_xrpl.yaml",
"VALIDATION_FILE_PATH": "tests/fixtures/validation.yaml"
})
def test_get_collector_registry_for_xrpl(self):
"""Tests that the XRPL collector is called with the correct args"""
self.collector_registry = CollectorRegistry()
with mock.patch('collectors.XRPLCollector', new=mock.Mock()) as collector:
helper_test_collector_registry(self, collector)

@mock.patch.dict(os.environ, {
"CONFIG_FILE_PATH": "tests/fixtures/configuration_evmhttp.yaml",
"VALIDATION_FILE_PATH": "tests/fixtures/validation.yaml"
15 changes: 15 additions & 0 deletions src/tests/fixtures/configuration_xrpl.yaml
@@ -0,0 +1,15 @@
blockchain: "xrpl"
chain_id: 1234
network_name: "Testnet"
network_type: "Testnet"
integration_maturity: "development"
canonical_name: "test-network-testnet"
chain_selector: 121212
collector: "xrpl"
endpoints:
- url: https://test1.com
provider: TestProvider1
- url: https://test2.com
provider: TestProvider2
- url: https://test3.com
provider: TestProvider3