> Tool for syncing Ethereum headers into a Postgres database

## Table of Contents
1. [Background](#background)
1. [Install](#install)
1. [Usage](#usage)
1. [Contributing](#contributing)
1. [License](#license)

## Background
Ethereum data is natively stored in key-value databases such as leveldb (geth) and rocksdb (openethereum). While storage in these KV databases is optimized for scalability and well suited to the needs of consensus, it is not necessarily ideal for use by external applications. Moving Ethereum data into a relational database can provide many advantages for downstream data consumers.

This tool syncs Ethereum headers into Postgres. Additionally, it validates headers from the last 15 blocks to ensure the data is up to date, and it handles chain reorgs by validating the most recent blocks' hashes and upserting invalid header records.
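To make the reorg handling concrete, here is a minimal Go sketch of such an upsert keyed on block number. The table layout and function name are assumptions for illustration, not this tool's actual schema; Postgres's `ON CONFLICT ... DO UPDATE` clause performs the overwrite in a single statement.

```go
// Hypothetical reorg-handling upsert; the headers table below is an
// illustrative schema, not necessarily the one this tool creates:
//
//   CREATE TABLE headers (
//       block_number BIGINT PRIMARY KEY,
//       hash         VARCHAR(66) NOT NULL,
//       raw          JSONB
//   );
package headers

import (
	"database/sql"

	_ "github.com/lib/pq" // Postgres driver, registered for database/sql
)

// UpsertHeader inserts a header row, or overwrites the stored hash and raw
// header when a row for that block number already exists. After a reorg the
// node reports a different hash for a recent block, so the conflict branch
// replaces the now-invalid record.
func UpsertHeader(db *sql.DB, blockNumber int64, hash string, raw []byte) error {
	_, err := db.Exec(`
		INSERT INTO headers (block_number, hash, raw)
		VALUES ($1, $2, $3)
		ON CONFLICT (block_number)
		DO UPDATE SET hash = EXCLUDED.hash, raw = EXCLUDED.raw`,
		blockNumber, hash, raw)
	return err
}
```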
This is useful when you want a minimal baseline from which to track and hash-link targeted data on the blockchain (e.g. individual smart contract storage values or event logs). Some examples of this are the [eth-contract-watcher]() and [eth-account-watcher]().

Headers are fetched by RPC queries to the standard `eth_getBlockByNumber` JSON-RPC endpoint, so headers can be synced from anything that supports this endpoint.
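As one illustration, a header fetch against that endpoint might look like the following Go sketch, using go-ethereum's `ethclient` wrapper. The client choice, function name, and error handling here are assumptions for illustration, not this tool's internals.

```go
// Sketch of fetching a header over the endpoint described above, using
// go-ethereum's ethclient wrapper.
package headers

import (
	"context"
	"math/big"

	"github.com/ethereum/go-ethereum/core/types"
	"github.com/ethereum/go-ethereum/ethclient"
)

// FetchHeader retrieves the header for the given block number; passing a nil
// number returns the latest header. HeaderByNumber issues an
// eth_getBlockByNumber request under the hood.
func FetchHeader(ctx context.Context, rpcPath string, number *big.Int) (*types.Header, error) {
	client, err := ethclient.Dial(rpcPath)
	if err != nil {
		return nil, err
	}
	defer client.Close()
	return client.HeaderByNumber(ctx, number)
}
```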
The config file must be formatted as follows, and should contain an RPC path to a running Ethereum node:

```toml
[database]
name     = "vulcanize_public"
hostname = "localhost"
user     = "postgres"
password = ""
port     = 5432

[client]
rpcPath  = "http://127.0.0.1:8545"
```
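For illustration, this file could be read with a TOML-aware config library. Below is a minimal sketch using `spf13/viper`; that library choice and the `LoadConfig` helper are assumptions, not necessarily how this tool handles its flags and config.

```go
// Illustrative loading of the config file above with spf13/viper; the tool's
// actual flag and config handling may differ.
package headers

import (
	"fmt"

	"github.com/spf13/viper"
)

// LoadConfig reads the TOML file and returns a Postgres connection string
// plus the Ethereum node RPC path.
func LoadConfig(path string) (dbConn, rpcPath string, err error) {
	viper.SetConfigFile(path)
	if err = viper.ReadInConfig(); err != nil {
		return "", "", fmt.Errorf("reading config %s: %w", path, err)
	}
	dbConn = fmt.Sprintf("postgresql://%s:%s@%s:%d/%s",
		viper.GetString("database.user"),
		viper.GetString("database.password"),
		viper.GetString("database.hostname"),
		viper.GetInt("database.port"),
		viper.GetString("database.name"))
	return dbConn, viper.GetString("client.rpcPath"), nil
}
```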

## Maintainers
@vulcanize
@AFDudley
@i-norden

## Contributing
Contributions are welcome!

VulcanizeDB follows the [Contributor Covenant Code of Conduct](https://www.contributor-covenant.org/version/1/4/code-of-conduct).

For more information on contributing, please see [here](documentation/contributing.md).