diff --git a/.ai/pages/develop-interoperability-xcm-guides.md b/.ai/pages/develop-interoperability-xcm-guides.md index 0b3833a56..a14d646ae 100644 --- a/.ai/pages/develop-interoperability-xcm-guides.md +++ b/.ai/pages/develop-interoperability-xcm-guides.md @@ -20,28 +20,24 @@ Whether you're building applications that need to interact with multiple chains,

Send XCM Messages

-

Learn the fundamentals of sending cross-chain messages using XCM, including message structure, routing, and execution patterns.

XCM Configuration

-

Learn how to configure XCM for your chain.

Test and Debug

-

Learn how to test and debug cross-chain communication via the XCM Emulator to ensure interoperability and reliable execution.

XCM Channels

-

Learn how to configure XCM channels for your chain.

diff --git a/.ai/pages/develop-interoperability.md b/.ai/pages/develop-interoperability.md index 4325912a3..49b1616b5 100644 --- a/.ai/pages/develop-interoperability.md +++ b/.ai/pages/develop-interoperability.md @@ -21,28 +21,24 @@ This section covers everything you need to know about building and implementing

Review the Polkadot SDK's XCM Documentation

-

Dive into the official documentation to learn about the key components for supporting XCM in your parachain and enabling seamless cross-chain communication.

Follow Step-by-Step Tutorials

-

Enhance your XCM skills with step-by-step tutorials on building interoperability solutions on Polkadot SDK-based blockchains.

Familiarize Yourself with the XCM Format

-

Gain a deeper understanding of the XCM format and structure, including any extra data it may need and what each part of a message means.

Essential XCM Tools

-

Explore essential tools for creating and integrating cross-chain solutions within the Polkadot ecosystem.

diff --git a/.ai/pages/develop-parachains-customize-parachain.md b/.ai/pages/develop-parachains-customize-parachain.md index 86fdf0835..0c37b49c4 100644 --- a/.ai/pages/develop-parachains-customize-parachain.md +++ b/.ai/pages/develop-parachains-customize-parachain.md @@ -20,21 +20,18 @@ The [FRAME directory](https://github.com/paritytech/polkadot-sdk/tree/polkadot-s

FRAME Repository

-

View the source code of the FRAME development environment, which provides pallets you can use, modify, and extend to build runtime logic suited to the needs of your blockchain.

FRAME Rust Docs

-

Check out the Rust documentation for FRAME, Substrate's preferred framework for building runtimes.

Polkadot SDK Best Practices

-

Understand and address common issues that can arise in blockchain development when building with the Polkadot SDK.

diff --git a/.ai/pages/develop-parachains-deployment.md b/.ai/pages/develop-parachains-deployment.md index bd9dae5d0..6cb349454 100644 --- a/.ai/pages/develop-parachains-deployment.md +++ b/.ai/pages/develop-parachains-deployment.md @@ -71,7 +71,6 @@ flowchart TD

Check Out the Chain Spec Builder Docs

-

Learn about Substrate's chain spec builder utility.

diff --git a/.ai/pages/develop-parachains-maintenance.md b/.ai/pages/develop-parachains-maintenance.md index 1cc2392b7..4949e2b06 100644 --- a/.ai/pages/develop-parachains-maintenance.md +++ b/.ai/pages/develop-parachains-maintenance.md @@ -18,14 +18,12 @@ Learn how to maintain Polkadot SDK-based networks, focusing on runtime monitorin

Single Block Migration Example

-

Check out an example pallet demonstrating best practices for writing single-block migrations while upgrading pallet storage.

Client Telemetry Crate

-

Check out the docs on Substrate's client telemetry, the part of Substrate that allows telemetry data to be ingested by services such as Polkadot Telemetry.

diff --git a/.ai/pages/develop-parachains-testing.md b/.ai/pages/develop-parachains-testing.md index 45fc04202..350d47b02 100644 --- a/.ai/pages/develop-parachains-testing.md +++ b/.ai/pages/develop-parachains-testing.md @@ -25,14 +25,12 @@ Through these guides, you'll learn to:

`sp_runtime` Crate Rust Docs

-

Learn about Substrate Runtime primitives that enable communication between a Substrate blockchain's runtime and client.

Moonwall Testing Framework

-

Moonwall is a comprehensive blockchain test framework for Substrate-based networks.

diff --git a/.ai/pages/develop-toolkit-api-libraries.md b/.ai/pages/develop-toolkit-api-libraries.md index 1e84ff250..2e9c39647 100644 --- a/.ai/pages/develop-toolkit-api-libraries.md +++ b/.ai/pages/develop-toolkit-api-libraries.md @@ -18,14 +18,12 @@ Explore the powerful API libraries designed for interacting with the Polkadot ne

Understand Chain Data

-

Familiarize yourself with the data provided by the APIs, including available calls, events, types, and storage items.

Network Configurations

-

Obtain the necessary configurations and WSS endpoints to interact with the APIs on Polkadot networks.

diff --git a/.ai/pages/develop-toolkit-parachains-fork-chains-chopsticks.md b/.ai/pages/develop-toolkit-parachains-fork-chains-chopsticks.md index d08a5a711..ac175c009 100644 --- a/.ai/pages/develop-toolkit-parachains-fork-chains-chopsticks.md +++ b/.ai/pages/develop-toolkit-parachains-fork-chains-chopsticks.md @@ -28,14 +28,12 @@ Whether you're debugging an issue, testing new features, or exploring cross-chai

Chopsticks Repository

-

View the official Chopsticks GitHub repository. Browse the code, review sample commands, and track issues and new releases.

Fork Live Chains with Chopsticks

-

Learn how to fork live Polkadot SDK chains with Chopsticks. Configure forks, replay blocks, test XCM, and interact programmatically or via UI.

diff --git a/.ai/pages/develop-toolkit-parachains-fork-chains.md b/.ai/pages/develop-toolkit-parachains-fork-chains.md index 5e1e5d70f..c9ad1ec94 100644 --- a/.ai/pages/develop-toolkit-parachains-fork-chains.md +++ b/.ai/pages/develop-toolkit-parachains-fork-chains.md @@ -29,7 +29,6 @@ Forking a live chain creates a controlled environment that mirrors live network

Step-by-Step Tutorial on Forking Live Chains with Chopsticks

-

This tutorial walks you through forking live Polkadot SDK chains with Chopsticks: configure forks, replay blocks, and test XCM execution.

diff --git a/.ai/pages/develop-toolkit-parachains-spawn-chains-zombienet.md b/.ai/pages/develop-toolkit-parachains-spawn-chains-zombienet.md index ecd4913e2..4df40939d 100644 --- a/.ai/pages/develop-toolkit-parachains-spawn-chains-zombienet.md +++ b/.ai/pages/develop-toolkit-parachains-spawn-chains-zombienet.md @@ -27,7 +27,6 @@ Whether you're building a new parachain or testing runtime upgrades, Zombienet p

Spawn a Chain with Zombienet Tutorial

-

Follow step-by-step instructions to spawn, connect to, and monitor a basic blockchain network with Zombienet, using customizable configurations for streamlined development and debugging.

diff --git a/.ai/pages/develop-toolkit-parachains-spawn-chains.md b/.ai/pages/develop-toolkit-parachains-spawn-chains.md index 3cb1a73d6..83f364a14 100644 --- a/.ai/pages/develop-toolkit-parachains-spawn-chains.md +++ b/.ai/pages/develop-toolkit-parachains-spawn-chains.md @@ -29,7 +29,6 @@ Spawning a network provides a controlled environment to test and validate variou

Spawn a Chain with Zombienet

-

Learn to spawn, connect to, and monitor a basic blockchain network with Zombienet, using customizable configurations for streamlined development and debugging.

diff --git a/.ai/pages/infrastructure-running-a-validator-onboarding-and-offboarding.md b/.ai/pages/infrastructure-running-a-validator-onboarding-and-offboarding.md index e2dd7bc3f..8575896b8 100644 --- a/.ai/pages/infrastructure-running-a-validator-onboarding-and-offboarding.md +++ b/.ai/pages/infrastructure-running-a-validator-onboarding-and-offboarding.md @@ -20,28 +20,24 @@ This section provides guidance on how to set up, activate, and deactivate your v

Review the Requirements

-

Explore the technical and system requirements for running a Polkadot validator, including setup, hardware, staking prerequisites, and security best practices.

Learn About Staking Mechanics

-

Explore the staking mechanics in Polkadot, focusing on how they relate to validators, including offenses and slashes, as well as reward payouts.

Maintain Your Node

-

Learn how to manage your Polkadot validator node, including monitoring performance, running a backup validator for maintenance, and rotating keys.

Get Help and Connect With Experts

-

For help, connect with the Polkadot Validator Lounge on Element, where both the team and experienced validators are ready to assist.

diff --git a/.ai/pages/infrastructure-running-a-validator-operational-tasks.md b/.ai/pages/infrastructure-running-a-validator-operational-tasks.md index 2f1648250..f54ea10d1 100644 --- a/.ai/pages/infrastructure-running-a-validator-operational-tasks.md +++ b/.ai/pages/infrastructure-running-a-validator-operational-tasks.md @@ -18,14 +18,12 @@ Running a Polkadot validator node involves several key operational tasks to ensu

Access Real-Time Validator Metrics

-

Check the Polkadot Telemetry dashboard for real-time insights into node performance, including validator status, connectivity, block production, and software version, to help you identify potential issues.

Stay Up to Date with Runtime Upgrades

-

Learn how to monitor the Polkadot network for upcoming upgrades, so you can prepare your validator node for any required updates or modifications.

diff --git a/.ai/pages/infrastructure-running-a-validator.md b/.ai/pages/infrastructure-running-a-validator.md index d3a6cd49f..09f3d534b 100644 --- a/.ai/pages/infrastructure-running-a-validator.md +++ b/.ai/pages/infrastructure-running-a-validator.md @@ -20,21 +20,18 @@ Learn the requirements for setting up a Polkadot validator node, along with deta

Explore Rewards, Offenses, and Slashes

-

Learn about Polkadot's offenses and slashing system, along with validator rewards, era points, and nominator payments.

Check Out the Decentralized Nodes Program

-

The Decentralized Nodes program aims to support Polkadot's security and decentralization by involving a diverse set of validators. Learn more and apply.

Get Help and Connect With Experts

-

For help, connect with the Polkadot Validator Lounge on Element, where both the team and experienced validators are ready to assist.

diff --git a/.ai/pages/infrastructure-staking-mechanics.md b/.ai/pages/infrastructure-staking-mechanics.md index 2ecbb1440..f8c35fc33 100644 --- a/.ai/pages/infrastructure-staking-mechanics.md +++ b/.ai/pages/infrastructure-staking-mechanics.md @@ -18,21 +18,18 @@ Gain a deep understanding of the staking mechanics in Polkadot, with a focus on

Learn About Nominated Proof of Stake

-

Take a deeper dive into the fundamentals of Polkadot's Nominated Proof of Stake (NPoS) consensus mechanism.

Dive Deep into Slashing Mechanisms

-

Read the Web3 Foundation's research article on slashing mechanisms for a comprehensive understanding of slashing, along with an in-depth examination of the offenses involved.

Review Validator Rewards Metrics

-

Check out Dune's Polkadot Staking Rewards dashboard for a detailed look at validator-specific metrics over time, such as daily staking rewards, nominator count, reward points, and more.

diff --git a/.ai/pages/tutorials-dapps.md b/.ai/pages/tutorials-dapps.md index 940a4746f..816de3147 100644 --- a/.ai/pages/tutorials-dapps.md +++ b/.ai/pages/tutorials-dapps.md @@ -20,14 +20,12 @@ You'll explore a range of topics—from client-side apps and CLI tools to on-cha

Polkadot API (PAPI)

-

Learn how to use the Polkadot API to build dApps that interact with Polkadot SDK-based chains directly via RPC or light clients.
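As a quick illustration of the pattern the linked tutorial builds on, here is a minimal PAPI connection sketch (not taken from that page). It assumes chain descriptors have been generated under the alias `dot` (for example with `npx papi add dot -n polkadot`) and that the public WSS endpoint shown is reachable.

```typescript
// Minimal PAPI sketch: connect to a chain and read a storage item.
// Assumptions: `dot` descriptors generated via `npx papi add dot -n polkadot`,
// and a reachable public WSS endpoint.
import { dot } from '@polkadot-api/descriptors';
import { createClient } from 'polkadot-api';
import { getWsProvider } from 'polkadot-api/ws-provider/web';
import { withPolkadotSdkCompat } from 'polkadot-api/polkadot-sdk-compat';

// Connect to Polkadot over WebSocket.
const client = createClient(
  withPolkadotSdkCompat(getWsProvider('wss://rpc.polkadot.io')),
);

// The typed API provides type-safe access to storage, transactions, and runtime APIs.
const api = client.getTypedApi(dot);

// Example read: the current block number from the System pallet's storage.
const blockNumber = await api.query.System.Number.getValue();
console.log('Current block number:', blockNumber);

client.destroy();
```

The same typed-API calls work when the WebSocket provider is swapped for a light-client provider, which is the other connection mode the tutorial mentions.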

Start Building on Polkadot

-

Get an overview of the tools, SDKs, and templates available for building with Polkadot—from runtime development to frontend integration.

diff --git a/.ai/pages/tutorials-interoperability-xcm-channels.md b/.ai/pages/tutorials-interoperability-xcm-channels.md index 3f1856040..d0e1948a3 100644 --- a/.ai/pages/tutorials-interoperability-xcm-channels.md +++ b/.ai/pages/tutorials-interoperability-xcm-channels.md @@ -26,7 +26,6 @@ To enable communication between parachains, explicit HRMP channels must be estab

Review HRMP Configurations and Extrinsics

-

Learn about the configurable parameters that govern HRMP channel behavior and the dispatchable extrinsics used to manage them.
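To make the extrinsics side of that page concrete, here is a hedged sketch of building the relay chain's `hrmp_init_open_channel` and `hrmp_accept_open_channel` calls with PAPI. The `rc` descriptor alias, the endpoint, and the example parameter values are assumptions for illustration; the snake_case field names follow the relay chain runtime metadata.

```typescript
// Sketch: constructing HRMP channel-management extrinsics with PAPI.
// Assumptions: `rc` descriptors generated (e.g. `npx papi add rc -n polkadot`),
// and example para IDs / limits chosen purely for illustration.
import { rc } from '@polkadot-api/descriptors';
import { createClient } from 'polkadot-api';
import { getWsProvider } from 'polkadot-api/ws-provider/web';
import { withPolkadotSdkCompat } from 'polkadot-api/polkadot-sdk-compat';

const client = createClient(
  withPolkadotSdkCompat(getWsProvider('wss://rpc.polkadot.io')),
);
const relayApi = client.getTypedApi(rc);

// Propose a channel from the calling parachain to parachain 2000.
const initOpen = relayApi.tx.Hrmp.hrmp_init_open_channel({
  recipient: 2000,
  proposed_max_capacity: 8, // max number of queued messages
  proposed_max_message_size: 1024, // max message size in bytes
});

// The recipient parachain later accepts the proposal from the sender (e.g. 2001).
const accept = relayApi.tx.Hrmp.hrmp_accept_open_channel({ sender: 2001 });

// In practice these calls are dispatched from each parachain's sovereign
// account on the relay chain (typically wrapped in an XCM `Transact`), so the
// SCALE-encoded call data is usually what you need rather than signAndSubmit.
console.log((await initOpen.getEncodedData()).asHex());
console.log((await accept.getEncodedData()).asHex());

client.destroy();
```

Note that the proposed capacity and message size must stay within the relay chain's HRMP configuration limits, which are among the configurable parameters the linked page describes.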

diff --git a/.ai/pages/tutorials-interoperability.md b/.ai/pages/tutorials-interoperability.md index 39acf65bd..5f4e222a2 100644 --- a/.ai/pages/tutorials-interoperability.md +++ b/.ai/pages/tutorials-interoperability.md @@ -31,14 +31,12 @@ Learn to establish and use cross-chain communication channels:

Learn about Polkadot's Interoperability

-

Explore the importance of interoperability in the Polkadot ecosystem, covering XCM, bridges, and cross-chain communication.

Explore Comprehensive XCM Guides

-

Looking for comprehensive guides and technical resources on XCM? Explore foundational concepts, advanced configuration, and best practices for building cross-chain solutions using XCM.

diff --git a/.ai/pages/tutorials-onchain-governance.md b/.ai/pages/tutorials-onchain-governance.md index 0f5d97d54..55d20346a 100644 --- a/.ai/pages/tutorials-onchain-governance.md +++ b/.ai/pages/tutorials-onchain-governance.md @@ -20,7 +20,6 @@ This section provides step-by-step tutorials to help you navigate the technical

Learn More About Polkadot's OpenGov

-

Explore Polkadot's decentralized on-chain governance system, OpenGov, including how it works, the proposal process, and key info for developers.

diff --git a/.ai/pages/tutorials-polkadot-sdk-system-chains-asset-hub.md b/.ai/pages/tutorials-polkadot-sdk-system-chains-asset-hub.md index 8cfc67b67..ae1b50408 100644 --- a/.ai/pages/tutorials-polkadot-sdk-system-chains-asset-hub.md +++ b/.ai/pages/tutorials-polkadot-sdk-system-chains-asset-hub.md @@ -35,7 +35,6 @@ Through these tutorials, you'll learn how to manage cross-chain assets, includin

Learn More About Asset Hub

-

Explore the fundamentals of Asset Hub, including managing on-chain assets, foreign asset integration, and using XCM for cross-chain asset transfers.

diff --git a/.ai/pages/tutorials-polkadot-sdk.md b/.ai/pages/tutorials-polkadot-sdk.md index 9cc3bfb54..feb7470eb 100644 --- a/.ai/pages/tutorials-polkadot-sdk.md +++ b/.ai/pages/tutorials-polkadot-sdk.md @@ -28,7 +28,6 @@ Follow these key milestones to guide you through parachain development. Each ste

View the Polkadot SDK Source Code

-

Check out the Polkadot SDK repository on GitHub to explore the source code and stay updated on the latest releases.

diff --git a/.ai/pages/tutorials.md b/.ai/pages/tutorials.md index a97c290d5..fefe297bf 100644 --- a/.ai/pages/tutorials.md +++ b/.ai/pages/tutorials.md @@ -20,7 +20,6 @@ The Zero to Hero series offers step-by-step guidance to development across the P

Parachain Zero to Hero

-

Begin with a template, then follow this series of step-by-step guides to add pallets, write unit tests and benchmarks, run your parachain locally, perform runtime upgrades, deploy to TestNet, and obtain coretime.

@@ -32,28 +31,24 @@ The Zero to Hero series offers step-by-step guidance to development across the P

Set Up a Template

-

Learn to compile and run a local parachain node using the Polkadot SDK. Launch, run, and interact with a pre-configured runtime template.

Build a Custom Pallet

-

Learn how to build a custom pallet for Polkadot SDK-based blockchains with this step-by-step guide. Create and configure a simple counter pallet from scratch.

Fork a Live Chain with Chopsticks

-

Learn how to fork live Polkadot SDK chains with Chopsticks. Configure forks, replay blocks, test XCM, and interact programmatically or via UI.

Open an XCM Channel

-

Learn how to open HRMP channels between parachains on Polkadot. Discover the step-by-step process for establishing uni- and bidirectional communication.

diff --git a/.ai/site-index.json b/.ai/site-index.json index e68037363..08f3be47f 100644 --- a/.ai/site-index.json +++ b/.ai/site-index.json @@ -930,12 +930,12 @@ } ], "stats": { - "chars": 1663, - "words": 215, + "chars": 1619, + "words": 211, "headings": 2, - "estimated_token_count_total": 340 + "estimated_token_count_total": 328 }, - "hash": "sha256:d4c2d7fd46ddf60f638f948c88ba3940de6d69f140923ba8df52ed787b0afede", + "hash": "sha256:30ffcc12fff151fd0fa1baedfa803ecbb15106504df99c5a032ca173fffe0eca", "token_estimator": "heuristic-v1" }, { @@ -1028,12 +1028,12 @@ } ], "stats": { - "chars": 2375, - "words": 326, + "chars": 2331, + "words": 322, "headings": 2, - "estimated_token_count_total": 402 + "estimated_token_count_total": 390 }, - "hash": "sha256:addb6b5bc11163daf47aef933fe23c7d1dcbc20fa4529090b3bbedf49bde4aa5", + "hash": "sha256:3ca63851d29942ed00d3f143930edd1248a6321d1d8f6473ae697b72b0b9116e", "token_estimator": "heuristic-v1" }, { @@ -1412,12 +1412,12 @@ } ], "stats": { - "chars": 1899, - "words": 259, + "chars": 1866, + "words": 256, "headings": 2, - "estimated_token_count_total": 335 + "estimated_token_count_total": 326 }, - "hash": "sha256:9a08b66442c564c7116c686d8914b74ad617326f450d0894b05e753462f69aac", + "hash": "sha256:705127e925f797216ab35ca7d0b4bb4fe56ee3c252318d35678e2d5f330a6571", "token_estimator": "heuristic-v1" }, { @@ -1635,7 +1635,7 @@ "headings": 12, "estimated_token_count_total": 3071 }, - "hash": "sha256:f89b54fce05c6e26b58bc8a38694953422faf4a3559799a7d2f70dcfd6176304", + "hash": "sha256:cc05e98509fc084a335177984baee35b16eb4388ac1df513edd730b889b782cc", "token_estimator": "heuristic-v1" }, { @@ -1768,12 +1768,12 @@ } ], "stats": { - "chars": 4806, - "words": 655, + "chars": 4795, + "words": 654, "headings": 3, - "estimated_token_count_total": 960 + "estimated_token_count_total": 957 }, - "hash": "sha256:32ff8711945e175aa7d073821a38320588997365d65b84a34aa36734066fc898", + "hash": "sha256:1acbec60b62ffd3359fa04d224e8be0154d6d115a65b2e33d119905d382a7f17", "token_estimator": "heuristic-v1" }, { @@ -2218,12 +2218,12 @@ } ], "stats": { - "chars": 1372, - "words": 174, + "chars": 1350, + "words": 172, "headings": 2, - "estimated_token_count_total": 236 + "estimated_token_count_total": 230 }, - "hash": "sha256:3b0a9e8037c7634c33ac6674170bd763599fca914855d9d2fbf490d359140130", + "hash": "sha256:f786ec04fd5c7179716a160f93f6bff3c497839cc3627e2279d9bec234ce0c3c", "token_estimator": "heuristic-v1" }, { @@ -2422,12 +2422,12 @@ } ], "stats": { - "chars": 1243, - "words": 157, + "chars": 1221, + "words": 155, "headings": 2, - "estimated_token_count_total": 211 + "estimated_token_count_total": 205 }, - "hash": "sha256:0ce1fe38de00827a0735b9fa8076492205c2450c61da9fbd1937d9f38cfe7825", + "hash": "sha256:16f4f67b56ecef53c3c7ab09c438dcc9d4e613b0824df5b1691bd7c4f6296eda", "token_estimator": "heuristic-v1" }, { @@ -4415,12 +4415,12 @@ } ], "stats": { - "chars": 1096, - "words": 141, + "chars": 1074, + "words": 139, "headings": 2, - "estimated_token_count_total": 191 + "estimated_token_count_total": 185 }, - "hash": "sha256:9df26b2d1c10327a2880d45d6a704664926a42511b6c3ec9fc63d185bdfe563e", + "hash": "sha256:900e54d04b11533efb15a34ebd76dc095a1873a926ecf2a5ce494cf0633c8be1", "token_estimator": "heuristic-v1" }, { @@ -5066,12 +5066,12 @@ } ], "stats": { - "chars": 1495, - "words": 201, + "chars": 1473, + "words": 199, "headings": 3, - "estimated_token_count_total": 291 + "estimated_token_count_total": 285 }, - "hash": 
"sha256:b568596033cdf68e60d72bcb7ee62a794def2bd3ff5b3317ef15895f58a12c57", + "hash": "sha256:340c8e81fdaca8a1b85a9addeed75cc617513b39658250693e2516c74b86aa6e", "token_estimator": "heuristic-v1" }, { @@ -5102,12 +5102,12 @@ } ], "stats": { - "chars": 1295, - "words": 176, + "chars": 1284, + "words": 175, "headings": 3, - "estimated_token_count_total": 183 + "estimated_token_count_total": 180 }, - "hash": "sha256:67be1f6e1199f4ef40e8d5394e8d472f5289b5a9ad384647a03db98b79229c8f", + "hash": "sha256:6e71534a424f6a08521b19c9b4cf668e495fb7c591463ffe63d1b03a8b17e435", "token_estimator": "heuristic-v1" }, { @@ -5636,12 +5636,12 @@ } ], "stats": { - "chars": 1237, - "words": 164, + "chars": 1226, + "words": 163, "headings": 3, - "estimated_token_count_total": 193 + "estimated_token_count_total": 190 }, - "hash": "sha256:1355969b6b0e723b42815b960c15eb128e4d936d0d707cd66e43820cff765ee3", + "hash": "sha256:1e474a9a1411a128abe943bdfabd8d5d27eaa7b52c5ba4c68379964fd27c6983", "token_estimator": "heuristic-v1" }, { @@ -5672,12 +5672,12 @@ } ], "stats": { - "chars": 1199, - "words": 157, + "chars": 1188, + "words": 156, "headings": 3, - "estimated_token_count_total": 171 + "estimated_token_count_total": 168 }, - "hash": "sha256:4fca64fa791400e9177f6cf3a913c8d041a9ea0c93e3a24d91478867da150288", + "hash": "sha256:e26ea88a73f187ffbf9c7287f80b9e51604b92896b7c032b26b3d034d3c46b7d", "token_estimator": "heuristic-v1" }, { @@ -6513,12 +6513,12 @@ } ], "stats": { - "chars": 1975, - "words": 257, + "chars": 1931, + "words": 253, "headings": 2, - "estimated_token_count_total": 416 + "estimated_token_count_total": 404 }, - "hash": "sha256:f86c2598c9296ca2d09e514ce512440b87f2f11476d2aa9bf0296d2c748b6c96", + "hash": "sha256:dad21b50f3732256f1367c2e79857f102635f0ed3015ed4013d7d0ca4d8b3a99", "token_estimator": "heuristic-v1" }, { @@ -6762,12 +6762,12 @@ } ], "stats": { - "chars": 1520, - "words": 203, + "chars": 1498, + "words": 201, "headings": 2, - "estimated_token_count_total": 236 + "estimated_token_count_total": 230 }, - "hash": "sha256:c4c79b14ccefef842c387775b22d2e42da7e136d495ccc2ea4dfed9dcd894667", + "hash": "sha256:f2cced19ba2b0b1ea46fd2f2892d328ac4797a1253d12cf479c64a447d7ce1ee", "token_estimator": "heuristic-v1" }, { @@ -6839,12 +6839,12 @@ } ], "stats": { - "chars": 1603, - "words": 218, + "chars": 1570, + "words": 215, "headings": 2, - "estimated_token_count_total": 319 + "estimated_token_count_total": 310 }, - "hash": "sha256:4771f0d30573e36eee1b14cdf3bbca70a4716a3ee8dc8726516487d01a87587c", + "hash": "sha256:7fb05e7b43cd5413b248605912ae0c6e24fd6c1ca199e64059c686c49d8cc456", "token_estimator": "heuristic-v1" }, { @@ -6987,12 +6987,12 @@ } ], "stats": { - "chars": 1824, - "words": 249, + "chars": 1791, + "words": 246, "headings": 2, - "estimated_token_count_total": 352 + "estimated_token_count_total": 343 }, - "hash": "sha256:073052dcb2d4852bd40338cce41f3aec28ec31f93b6170a4daa1c0c21d54b7cb", + "hash": "sha256:20f272dbbeb2b50a5e240b53ac45bed797ea58aa03e27c89194c941d66d8accf", "token_estimator": "heuristic-v1" }, { @@ -9876,12 +9876,12 @@ } ], "stats": { - "chars": 1220, - "words": 171, + "chars": 1198, + "words": 169, "headings": 2, - "estimated_token_count_total": 202 + "estimated_token_count_total": 196 }, - "hash": "sha256:479cbefd4369c5c83ff5ed3aca93db88889fd93f1ffaed7303f60fcd54fc77ce", + "hash": "sha256:fb892e81a2add1b64214c6cabe837d5068ebe54a7fb65e9149edbfb68f578a53", "token_estimator": "heuristic-v1" }, { @@ -10145,12 +10145,12 @@ } ], "stats": { - "chars": 1808, - "words": 239, + "chars": 1797, + 
"words": 238, "headings": 3, - "estimated_token_count_total": 208 + "estimated_token_count_total": 205 }, - "hash": "sha256:d32d46c63294ca7bbdb46d7b64fe28a6744910557c10b02c7fc3a55d53e577b4", + "hash": "sha256:c1f893d4086b0bf5d6b3c50d0c6cffe27d4deddf1240b250df6432eddcec969c", "token_estimator": "heuristic-v1" }, { @@ -10344,12 +10344,12 @@ } ], "stats": { - "chars": 2197, - "words": 272, + "chars": 2175, + "words": 270, "headings": 4, - "estimated_token_count_total": 349 + "estimated_token_count_total": 343 }, - "hash": "sha256:ba83e50c58f45330ce0a74e27d3f764cc1c710eda0fb3d4674b24cf9c87ff6ad", + "hash": "sha256:dcb1210e19815f3659ea283f8fc04dd5b85bc440a54e31e889f5ec5ae9229b78", "token_estimator": "heuristic-v1" }, { @@ -10456,12 +10456,12 @@ } ], "stats": { - "chars": 883, - "words": 114, + "chars": 872, + "words": 113, "headings": 2, - "estimated_token_count_total": 129 + "estimated_token_count_total": 126 }, - "hash": "sha256:cb856d135b9bcbc3c1c1a2713c70baf2d7a979a918d2f2d27fee82341412bb2b", + "hash": "sha256:b9c07713604ff9658363bf5e7a0726ecb7781418826ff65abffddffc8083d33f", "token_estimator": "heuristic-v1" }, { @@ -11375,12 +11375,12 @@ } ], "stats": { - "chars": 1778, - "words": 250, + "chars": 1767, + "words": 249, "headings": 4, - "estimated_token_count_total": 400 + "estimated_token_count_total": 397 }, - "hash": "sha256:20879ce95cc9ef66354350e0005642b0b3b35e412c0d4e34ad49aec3fd1a548c", + "hash": "sha256:9044f2d9bca77f3e0062a47a52c592c756384850b051cb5be4f9373cff79440d", "token_estimator": "heuristic-v1" }, { @@ -11614,12 +11614,12 @@ } ], "stats": { - "chars": 1489, - "words": 215, + "chars": 1478, + "words": 214, "headings": 3, - "estimated_token_count_total": 258 + "estimated_token_count_total": 255 }, - "hash": "sha256:60a52164328f3776b04f171212dc2407aa991f8f5b83a2d65855c96ca9ea06b2", + "hash": "sha256:189ac2cf8bfd44fc76a6f45f504effe4ea11653c5d86c7fa825b918fdbd4f564", "token_estimator": "heuristic-v1" }, { @@ -12179,12 +12179,12 @@ } ], "stats": { - "chars": 2498, - "words": 357, + "chars": 2443, + "words": 352, "headings": 4, - "estimated_token_count_total": 594 + "estimated_token_count_total": 579 }, - "hash": "sha256:3708031bfcbb55206c4a9aed14f4afbb9f742ea02ebb70dc350390f484c91a0b", + "hash": "sha256:eb544bbab067f4b3516190c6fe2df01049970bebb409095af4d3fd9b8bb771fe", "token_estimator": "heuristic-v1" } ] \ No newline at end of file diff --git a/develop/interoperability/index.md b/develop/interoperability/index.md index bda85fcce..da17828bd 100644 --- a/develop/interoperability/index.md +++ b/develop/interoperability/index.md @@ -21,28 +21,24 @@ This section covers everything you need to know about building and implementing

Review the Polkadot SDK's XCM Documentation

-

Dive into the official documentation to learn about the key components for supporting XCM in your parachain and enabling seamless cross-chain communication.

Follow Step-by-Step Tutorials

-

Enhance your XCM skills with step-by-step tutorials on building interoperability solutions on Polkadot SDK-based blockchains.

Familiarize Yourself with the XCM Format

-

Gain a deeper understanding of the XCM format and structure, including any extra data it may need and what each part of a message means.

Essential XCM Tools

-

Explore essential tools for creating and integrating cross-chain solutions within the Polkadot ecosystem.

diff --git a/develop/interoperability/xcm-guides/index.md b/develop/interoperability/xcm-guides/index.md index dc0c16157..38cf9f3ef 100644 --- a/develop/interoperability/xcm-guides/index.md +++ b/develop/interoperability/xcm-guides/index.md @@ -20,28 +20,24 @@ Whether you're building applications that need to interact with multiple chains,

Send XCM Messages

-

Learn the fundamentals of sending cross-chain messages using XCM, including message structure, routing, and execution patterns.

XCM Configuration

-

Learn how to configure XCM for your chain.

Test and Debug

-

Learn how to test and debug cross-chain communication via the XCM Emulator to ensure interoperability and reliable execution.

XCM Channels

-

Learn how to configure XCM channels for your chain.

diff --git a/develop/parachains/customize-parachain/index.md b/develop/parachains/customize-parachain/index.md index 1c4c41eaf..41d903493 100644 --- a/develop/parachains/customize-parachain/index.md +++ b/develop/parachains/customize-parachain/index.md @@ -20,21 +20,18 @@ The [FRAME directory](https://github.com/paritytech/polkadot-sdk/tree/{{dependen

FRAME Repository

-

View the source code of the FRAME development environment, which provides pallets you can use, modify, and extend to build runtime logic suited to the needs of your blockchain.

FRAME Rust Docs

-

Check out the Rust documentation for FRAME, Substrate's preferred framework for building runtimes.

Polkadot SDK Best Practices

-

Understand and address common issues that can arise in blockchain development when building with the Polkadot SDK.

diff --git a/develop/parachains/deployment/index.md b/develop/parachains/deployment/index.md index ec45f2e10..a6d6d500a 100644 --- a/develop/parachains/deployment/index.md +++ b/develop/parachains/deployment/index.md @@ -71,7 +71,6 @@ flowchart TD

Check Out the Chain Spec Builder Docs

-

Learn about Substrate's chain spec builder utility.

diff --git a/develop/parachains/maintenance/index.md b/develop/parachains/maintenance/index.md index 0a135999a..4eaaf37f3 100644 --- a/develop/parachains/maintenance/index.md +++ b/develop/parachains/maintenance/index.md @@ -18,14 +18,12 @@ Learn how to maintain Polkadot SDK-based networks, focusing on runtime monitorin

Single Block Migration Example

-

Check out an example pallet demonstrating best practices for writing single-block migrations while upgrading pallet storage.

Client Telemetry Crate

-

Check out the docs on Substrate's client telemetry, the part of Substrate that allows telemetry data to be ingested by services such as Polkadot Telemetry.

diff --git a/develop/parachains/testing/index.md b/develop/parachains/testing/index.md index e542b335f..c35d9b44d 100644 --- a/develop/parachains/testing/index.md +++ b/develop/parachains/testing/index.md @@ -25,14 +25,12 @@ Through these guides, you'll learn to:

`sp_runtime` Crate Rust Docs

-

Learn about Substrate Runtime primitives that enable communication between a Substrate blockchain's runtime and client.

Moonwall Testing Framework

-

Moonwall is a comprehensive blockchain test framework for Substrate-based networks.

diff --git a/develop/toolkit/api-libraries/index.md b/develop/toolkit/api-libraries/index.md index 2a311a8e2..df16feeb7 100644 --- a/develop/toolkit/api-libraries/index.md +++ b/develop/toolkit/api-libraries/index.md @@ -18,14 +18,12 @@ Explore the powerful API libraries designed for interacting with the Polkadot ne

Understand Chain Data

-

Familiarize yourself with the data provided by the APIs, including available calls, events, types, and storage items.

Network Configurations

-

Obtain the necessary configurations and WSS endpoints to interact with the APIs on Polkadot networks.

diff --git a/develop/toolkit/parachains/fork-chains/chopsticks/index.md b/develop/toolkit/parachains/fork-chains/chopsticks/index.md index b98174a5c..8e4c1db90 100644 --- a/develop/toolkit/parachains/fork-chains/chopsticks/index.md +++ b/develop/toolkit/parachains/fork-chains/chopsticks/index.md @@ -28,14 +28,12 @@ Whether you're debugging an issue, testing new features, or exploring cross-chai

Chopsticks Repository

-

View the official Chopsticks GitHub repository. Browse the code, review sample commands, and track issues and new releases.

Fork Live Chains with Chopsticks

-

Learn how to fork live Polkadot SDK chains with Chopsticks. Configure forks, replay blocks, test XCM, and interact programmatically or via UI.

diff --git a/develop/toolkit/parachains/fork-chains/index.md b/develop/toolkit/parachains/fork-chains/index.md index a25d6a5f0..d2e842e2d 100644 --- a/develop/toolkit/parachains/fork-chains/index.md +++ b/develop/toolkit/parachains/fork-chains/index.md @@ -29,7 +29,6 @@ Forking a live chain creates a controlled environment that mirrors live network

Step-by-Step Tutorial on Forking Live Chains with Chopsticks

-

This tutorial walks you through forking live Polkadot SDK chains with Chopsticks: configure forks, replay blocks, and test XCM execution.

diff --git a/develop/toolkit/parachains/spawn-chains/index.md b/develop/toolkit/parachains/spawn-chains/index.md index 12c7bb1fc..5f4ebf7e5 100644 --- a/develop/toolkit/parachains/spawn-chains/index.md +++ b/develop/toolkit/parachains/spawn-chains/index.md @@ -29,7 +29,6 @@ Spawning a network provides a controlled environment to test and validate variou

Spawn a Chain with Zombienet

-

Learn to spawn, connect to, and monitor a basic blockchain network with Zombienet, using customizable configurations for streamlined development and debugging.

\ No newline at end of file diff --git a/develop/toolkit/parachains/spawn-chains/zombienet/index.md b/develop/toolkit/parachains/spawn-chains/zombienet/index.md index 64ea769c2..b134cc1de 100644 --- a/develop/toolkit/parachains/spawn-chains/zombienet/index.md +++ b/develop/toolkit/parachains/spawn-chains/zombienet/index.md @@ -27,7 +27,6 @@ Whether you're building a new parachain or testing runtime upgrades, Zombienet p

Spawn a Chain with Zombienet Tutorial

-

Follow step-by-step instructions to spawn, connect to, and monitor a basic blockchain network with Zombienet, using customizable configurations for streamlined development and debugging.

diff --git a/infrastructure/running-a-validator/index.md b/infrastructure/running-a-validator/index.md index b6964df1b..a058f70b8 100644 --- a/infrastructure/running-a-validator/index.md +++ b/infrastructure/running-a-validator/index.md @@ -20,21 +20,18 @@ Learn the requirements for setting up a Polkadot validator node, along with deta

Explore Rewards, Offenses, and Slashes

-

Learn about Polkadot's offenses and slashing system, along with validator rewards, era points, and nominator payments.

Check Out the Decentralized Nodes Program

-

The Decentralized Nodes program aims to support Polkadot's security and decentralization by involving a diverse set of validators. Learn more and apply.

Get Help and Connect With Experts

-

For help, connect with the Polkadot Validator Lounge on Element, where both the team and experienced validators are ready to assist.

diff --git a/infrastructure/running-a-validator/onboarding-and-offboarding/index.md b/infrastructure/running-a-validator/onboarding-and-offboarding/index.md index 5cb8a14d7..46a1de413 100644 --- a/infrastructure/running-a-validator/onboarding-and-offboarding/index.md +++ b/infrastructure/running-a-validator/onboarding-and-offboarding/index.md @@ -20,28 +20,24 @@ This section provides guidance on how to set up, activate, and deactivate your v

Review the Requirements

-

Explore the technical and system requirements for running a Polkadot validator, including setup, hardware, staking prerequisites, and security best practices.

Learn About Staking Mechanics

-

Explore the staking mechanics in Polkadot, focusing on how they relate to validators, including offenses and slashes, as well as reward payouts.

Maintain Your Node

-

Learn how to manage your Polkadot validator node, including monitoring performance, running a backup validator for maintenance, and rotating keys.

Get Help and Connect With Experts

-

For help, connect with the Polkadot Validator Lounge on Element, where both the team and experienced validators are ready to assist.

diff --git a/infrastructure/running-a-validator/operational-tasks/index.md b/infrastructure/running-a-validator/operational-tasks/index.md index 38217d1fc..978f26417 100644 --- a/infrastructure/running-a-validator/operational-tasks/index.md +++ b/infrastructure/running-a-validator/operational-tasks/index.md @@ -18,14 +18,12 @@ Running a Polkadot validator node involves several key operational tasks to ensu

Access Real-Time Validator Metrics

-

Check the Polkadot Telemetry dashboard for real-time insights into node performance, including validator status, connectivity, block production, and software version, to help you identify potential issues.

Stay Up to Date with Runtime Upgrades

-

Learn how to monitor the Polkadot network for upcoming upgrades, so you can prepare your validator node for any required updates or modifications.

diff --git a/infrastructure/staking-mechanics/index.md b/infrastructure/staking-mechanics/index.md index ea3dc258f..305ad7719 100644 --- a/infrastructure/staking-mechanics/index.md +++ b/infrastructure/staking-mechanics/index.md @@ -18,21 +18,18 @@ Gain a deep understanding of the staking mechanics in Polkadot, with a focus on

Learn About Nominated Proof of Stake

-

Take a deeper dive into the fundamentals of Polkadot's Nominated Proof of Stake (NPoS) consensus mechanism.

Dive Deep into Slashing Mechanisms

-

Read the Web3 Foundation's research article on slashing mechanisms for a comprehensive understanding of slashing, along with an in-depth examination of the offenses involved.

Review Validator Rewards Metrics

-

Check out Dune's Polkadot Staking Rewards dashboard for a detailed look at validator-specific metrics over time, such as daily staking rewards, nominator count, reward points, and more.

diff --git a/llms-full.jsonl b/llms-full.jsonl index ac87a067d..69c17b5df 100644 --- a/llms-full.jsonl +++ b/llms-full.jsonl @@ -104,7 +104,7 @@ {"page_id": "develop-interoperability-xcm-guides-from-apps-transfers", "page_title": "Transfers", "index": 3, "depth": 2, "title": "Origin Preservation", "anchor": "origin-preservation", "start_char": 11158, "end_char": 20278, "estimated_token_count": 1857, "token_estimator": "heuristic-v1", "text": "## Origin Preservation\n\nIn previous versions of XCM, doing cross-chain transfers meant losing the origin. The XCM on the destination chain would have access to the transferred assets, but not to the origin. This means any instruction which uses assets but not the origin could be executed, that's enough to call [`DepositAsset`](https://paritytech.github.io/polkadot-sdk/master/staging_xcm/v5/enum.Instruction.html#variant.DepositAsset){target=\\_blank} for example and complete the transfer, but not to call [`Transact`](https://paritytech.github.io/polkadot-sdk/master/staging_xcm/v5/enum.Instruction.html#variant.Transact){target=\\_blank} and execute a call.\n\nIn XCMv5, [`InitiateTransfer`](https://paritytech.github.io/polkadot-sdk/master/staging_xcm/v5/enum.Instruction.html#variant.InitiateTransfer){target=\\_blank} allows **preserving the origin**, enabling more use-cases such as executing a call on the destination chain via [`Transact`](https://paritytech.github.io/polkadot-sdk/master/staging_xcm/v5/enum.Instruction.html#variant.Transact){target=\\_blank}.\nTo enable this feature, the [`preserve_origin`](https://paritytech.github.io/polkadot-sdk/master/staging_xcm/v5/enum.Instruction.html#variant.InitiateTransfer.field.preserve_origin){target=\\_blank} parameter must be set to `true`.\n\n!!! note \"Why isn't preserving the origin the default?\"\n\n Preserving the origin requires a specific configuration on the underlying chain executing the XCM. Some chains have the right configuration, for example all system chains, but not every chain has it. If you make a transfer with `preserve_origin: true` to a chain configured incorrectly, the transfer will fail.\n\n However, if you set `preserve_origin: false` then there is no problem. Because of this, origin preservation is not the default, and likely never will be.\n\n??? code \"Teleport and Transact Example\"\n\n This example creates an XCM program that teleports DOT from Asset Hub to People and executes a call there. 
The whole script is almost the same as the one for a simple teleport above, most changes are in the `remoteXcm` variable.\n\n The setup for this script is [installing PAPI](/develop/toolkit/api-libraries/papi#get-started){target=\\_blank} and generating descriptors for both Asset Hub and People:\n `bun papi add ahp -n polkadot_asset_hub && bun papi add people -n polkadot_people`\n\n ```typescript title=\"teleport-and-transact.ts\"\n // `ahp` is the name given to `npx papi add`\n import {\n ahp,\n people,\n XcmV2OriginKind,\n XcmV3Junction,\n XcmV3Junctions,\n XcmV3MultiassetFungibility,\n XcmV5AssetFilter,\n XcmV5Instruction,\n XcmV5Junction,\n XcmV5Junctions,\n XcmV5WildAsset,\n XcmVersionedXcm,\n } from '@polkadot-api/descriptors';\n import { Binary, createClient, Enum, FixedSizeBinary } from 'polkadot-api';\n // import from \"polkadot-api/ws-provider/node\"\n // if running in a NodeJS environment\n import { getWsProvider } from 'polkadot-api/ws-provider/web';\n import { withPolkadotSdkCompat } from 'polkadot-api/polkadot-sdk-compat';\n import { sr25519CreateDerive } from '@polkadot-labs/hdkd';\n import {\n DEV_PHRASE,\n entropyToMiniSecret,\n mnemonicToEntropy,\n ss58Address,\n } from '@polkadot-labs/hdkd-helpers';\n import { getPolkadotSigner } from 'polkadot-api/signer';\n\n const entropy = mnemonicToEntropy(DEV_PHRASE);\n const miniSecret = entropyToMiniSecret(entropy);\n const derive = sr25519CreateDerive(miniSecret);\n const keyPair = derive('//Alice');\n\n const polkadotSigner = getPolkadotSigner(\n keyPair.publicKey,\n 'Sr25519',\n keyPair.sign\n );\n\n // Connect to Polkadot Asset Hub.\n // Pointing to localhost since this example uses chopsticks.\n const client = createClient(\n withPolkadotSdkCompat(getWsProvider('ws://localhost:8000'))\n );\n\n // Get the typed API, a typesafe API for interacting with the chain.\n const ahpApi = client.getTypedApi(ahp);\n\n const PEOPLE_PARA_ID = 1004;\n // The identifier for DOT is the location of the Polkadot Relay Chain,\n // which is 1 up relative to any parachain.\n const DOT = {\n parents: 1,\n interior: XcmV3Junctions.Here(),\n };\n // DOT has 10 decimals.\n const DOT_UNITS = 10_000_000_000n;\n\n // The DOT to withdraw for both fees and transfer.\n const dotToWithdraw = {\n id: DOT,\n fun: XcmV3MultiassetFungibility.Fungible(10n * DOT_UNITS),\n };\n // The DOT to use for local fee payment.\n const dotToPayFees = {\n id: DOT,\n fun: XcmV3MultiassetFungibility.Fungible(1n * DOT_UNITS),\n };\n // The location of the People Chain from Asset Hub.\n const destination = {\n parents: 1,\n interior: XcmV3Junctions.X1(XcmV3Junction.Parachain(PEOPLE_PARA_ID)),\n };\n // Pay for fees on the People Chain with teleported DOT.\n // This is specified independently of the transferred assets since they're used\n // exclusively for fees. Also because fees can be paid in a different\n // asset from the transferred assets.\n const remoteFees = Enum(\n 'Teleport',\n XcmV5AssetFilter.Definite([\n {\n id: DOT,\n fun: XcmV3MultiassetFungibility.Fungible(1n * DOT_UNITS),\n },\n ])\n );\n // No need to preserve origin for this example.\n const preserveOrigin = false;\n // The assets to transfer are whatever remains in the\n // holding register at the time of executing the `InitiateTransfer`\n // instruction. 
DOT in this case, teleported.\n const assets = [\n Enum('Teleport', XcmV5AssetFilter.Wild(XcmV5WildAsset.AllCounted(1))),\n ];\n // The beneficiary is the same account but on the People Chain.\n // This is a very common pattern for one public/private key pair\n // to hold assets on multiple chains.\n const beneficiary = FixedSizeBinary.fromBytes(keyPair.publicKey);\n // The call to be executed on the destination chain.\n // It's a simple remark with an event.\n // Create the call on Asset Hub since the system pallet is present in\n // every runtime, but if using any other pallet, connect to\n // the destination chain and create the call there.\n const remark = Binary.fromText('Hello, cross-chain!');\n const call = await ahpApi.tx.System.remark_with_event({\n remark,\n }).getEncodedData();\n // The XCM to be executed on the destination chain.\n // It's basically depositing everything to the beneficiary.\n const remoteXcm = [\n XcmV5Instruction.Transact({\n origin_kind: XcmV2OriginKind.SovereignAccount(),\n fallback_max_weight: undefined,\n call,\n }),\n XcmV5Instruction.RefundSurplus(),\n XcmV5Instruction.DepositAsset({\n assets: XcmV5AssetFilter.Wild(XcmV5WildAsset.AllCounted(1)),\n beneficiary: {\n parents: 0,\n interior: XcmV5Junctions.X1(\n XcmV5Junction.AccountId32({\n id: beneficiary,\n network: undefined,\n })\n ),\n },\n }),\n ];\n\n // The message assembles all the previously defined parameters.\n const xcm = XcmVersionedXcm.V5([\n XcmV5Instruction.WithdrawAsset([dotToWithdraw]),\n XcmV5Instruction.PayFees({ asset: dotToPayFees }),\n XcmV5Instruction.InitiateTransfer({\n destination,\n remote_fees: remoteFees,\n preserve_origin: preserveOrigin,\n assets,\n remote_xcm: remoteXcm,\n }),\n // Return any leftover fees from the fees register back to holding.\n XcmV5Instruction.RefundSurplus(),\n // Deposit remaining assets (refunded fees) to the originating account.\n // Using AllCounted(1) since only one asset type (DOT) remains - a minor optimization.\n XcmV5Instruction.DepositAsset({\n assets: XcmV5AssetFilter.Wild(XcmV5WildAsset.AllCounted(1)),\n beneficiary: {\n parents: 0,\n interior: XcmV5Junctions.X1(\n XcmV5Junction.AccountId32({\n id: beneficiary, // The originating account.\n network: undefined,\n })\n ),\n },\n }),\n ]);\n\n // The XCM weight is needed to set the `max_weight` parameter\n // on the actual `PolkadotXcm.execute()` call.\n const weightResult = await ahpApi.apis.XcmPaymentApi.query_xcm_weight(xcm);\n\n if (weightResult.success) {\n const weight = weightResult.success\n ? weightResult.value\n : { ref_time: 0n, proof_size: 0n };\n\n console.dir(weight);\n\n // The actual transaction to submit.\n // This tells Asset Hub to execute the XCM.\n const tx = ahpApi.tx.PolkadotXcm.execute({\n message: xcm,\n max_weight: weight,\n });\n\n // Sign and propagate to the network.\n const result = await tx.signAndSubmit(polkadotSigner);\n console.log(stringify(result));\n }\n\n client.destroy();\n\n // A helper function to print numbers inside of the result.\n function stringify(obj: any) {\n return JSON.stringify(\n obj,\n (_, v) => (typeof v === 'bigint' ? 
v.toString() : v),\n 2\n );\n }\n\n ```"} {"page_id": "develop-interoperability-xcm-guides-from-apps", "page_title": "From Apps", "index": 0, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 462, "end_char": 511, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} {"page_id": "develop-interoperability-xcm-guides", "page_title": "XCM Guides", "index": 0, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 466, "end_char": 516, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} -{"page_id": "develop-interoperability-xcm-guides", "page_title": "XCM Guides", "index": 1, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 516, "end_char": 1663, "estimated_token_count": 328, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n
\n
\n \n

Send XCM Messages

\n
\n

Learn the fundamentals of sending cross-chain messages using XCM, including message structure, routing, and execution patterns.

\n
\n
\n
\n \n

XCM Configuration

\n
\n

Learn how to configure XCM for your chain.

\n
\n
\n
\n \n

Test and Debug

\n
\n

Learn how to test and debug cross-chain communication via the XCM Emulator to ensure interoperability and reliable execution.

\n
\n
\n
\n \n

XCM Channels

\n
\n

Learn how to configure XCM channels for your chain.

\n
\n
\n
"} +{"page_id": "develop-interoperability-xcm-guides", "page_title": "XCM Guides", "index": 1, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 516, "end_char": 1619, "estimated_token_count": 316, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n
\n
\n \n

Send XCM Messages

\n

Learn the fundamentals of sending cross-chain messages using XCM, including message structure, routing, and execution patterns.

\n
\n
\n
\n \n

XCM Configuration

\n

Learn how to configure XCM for your chain.

\n
\n
\n
\n \n

Test and Debug

\n

Learn how to test and debug cross-chain communication via the XCM Emulator to ensure interoperability and reliable execution.

\n
\n
\n
\n \n

XCM Channels

\n

Learn how to configure XCM channels for your chain.

\n
\n
\n
"} {"page_id": "develop-interoperability-xcm-runtime-apis", "page_title": "XCM Runtime APIs", "index": 0, "depth": 2, "title": "Introduction", "anchor": "introduction", "start_char": 20, "end_char": 932, "estimated_token_count": 159, "token_estimator": "heuristic-v1", "text": "## Introduction\n\nRuntime APIs allow node-side code to extract information from the runtime state. While simple storage access retrieves stored values directly, runtime APIs enable arbitrary computation, making them a powerful tool for interacting with the chain's state.\n\nUnlike direct storage access, runtime APIs can derive values from storage based on arguments or perform computations that don't require storage access. For example, a runtime API might expose a formula for fee calculation, using only the provided arguments as inputs rather than fetching data from storage.\n\nIn general, runtime APIs are used for:\n\n- Accessing a storage item.\n- Retrieving a bundle of related storage items.\n- Deriving a value from storage based on arguments.\n- Exposing formulas for complex computational calculations.\n\nThis section will teach you about specific runtime APIs that support XCM processing and manipulation."} {"page_id": "develop-interoperability-xcm-runtime-apis", "page_title": "XCM Runtime APIs", "index": 1, "depth": 2, "title": "Dry Run API", "anchor": "dry-run-api", "start_char": 932, "end_char": 1492, "estimated_token_count": 140, "token_estimator": "heuristic-v1", "text": "## Dry Run API\n\nThe [Dry-run API](https://paritytech.github.io/polkadot-sdk/master/xcm_runtime_apis/dry_run/trait.DryRunApi.html){target=\\_blank}, given an extrinsic, or an XCM program, returns its effects:\n\n- Execution result\n- Local XCM (in the case of an extrinsic)\n- Forwarded XCMs\n- List of events\n\nThis API can be used independently for dry-running, double-checking, or testing. However, it mainly shines when used with the [Xcm Payment API](#xcm-payment-api), given that it only estimates fees if you know the specific XCM you want to execute or send."} {"page_id": "develop-interoperability-xcm-runtime-apis", "page_title": "XCM Runtime APIs", "index": 2, "depth": 3, "title": "Dry Run Call", "anchor": "dry-run-call", "start_char": 1492, "end_char": 10429, "estimated_token_count": 1647, "token_estimator": "heuristic-v1", "text": "### Dry Run Call\n\nThis API allows a dry-run of any extrinsic and obtaining the outcome if it fails or succeeds, as well as the local xcm and remote xcm messages sent to other chains.\n\n```rust\nfn dry_run_call(origin: OriginCaller, call: Call, result_xcms_version: XcmVersion) -> Result, Error>;\n```\n\n??? interface \"Input parameters\"\n\n `origin` ++\"OriginCaller\"++ ++\"required\"++\n \n The origin used for executing the transaction.\n\n ---\n\n `call` ++\"Call\"++ ++\"required\"++\n\n The extrinsic to be executed.\n\n ---\n\n??? interface \"Output parameters\"\n\n ++\"Result, Error>\"++\n\n Effects of dry-running an extrinsic. If an error occurs, it is returned instead of the effects.\n\n ??? child \"Type `CallDryRunEffects`\"\n\n `execution_result` ++\"DispatchResultWithPostInfo\"++\n\n The result of executing the extrinsic.\n\n ---\n\n `emitted_events` ++\"Vec\"++\n\n The list of events fired by the extrinsic.\n\n ---\n\n `local_xcm` ++\"Option>\"++\n\n The local XCM that was attempted to be executed, if any.\n\n ---\n\n `forwarded_xcms` ++\"Vec<(VersionedLocation, Vec>)>\"++\n\n The list of XCMs that were queued for sending.\n\n ??? 
child \"Type `Error`\"\n\n Enum:\n\n - **`Unimplemented`**: An API part is unsupported.\n - **`VersionedConversionFailed`**: Converting a versioned data structure from one version to another failed.\n\n??? interface \"Example\"\n\n This example demonstrates how to simulate a cross-chain asset transfer from the Paseo network to the Pop Network using a [reserve transfer](https://wiki.polkadot.com/learn/learn-xcm-usecases/#reserve-asset-transfer){target=\\_blank} mechanism. Instead of executing the actual transfer, the code shows how to test and verify the transaction's behavior through a dry run before performing it on the live network.\n\n Replace `INSERT_USER_ADDRESS` with your SS58 address before running the script.\n\n ***Usage with PAPI***\n\n ```js\n import { paseo } from '@polkadot-api/descriptors';\n import { createClient } from 'polkadot-api';\n import { getWsProvider } from 'polkadot-api/ws-provider/web';\n import { withPolkadotSdkCompat } from 'polkadot-api/polkadot-sdk-compat';\n import {\n PolkadotRuntimeOriginCaller,\n XcmVersionedLocation,\n XcmVersionedAssets,\n XcmV3Junction,\n XcmV3Junctions,\n XcmV3WeightLimit,\n XcmV3MultiassetFungibility,\n XcmV3MultiassetAssetId,\n } from '@polkadot-api/descriptors';\n import { DispatchRawOrigin } from '@polkadot-api/descriptors';\n import { Binary } from 'polkadot-api';\n import { ss58Decode } from '@polkadot-labs/hdkd-helpers';\n\n // Connect to the Paseo relay chain\n const client = createClient(\n withPolkadotSdkCompat(getWsProvider('wss://paseo-rpc.dwellir.com')),\n );\n\n const paseoApi = client.getTypedApi(paseo);\n\n const popParaID = 4001;\n const userAddress = 'INSERT_USER_ADDRESS';\n const userPublicKey = ss58Decode(userAddress)[0];\n const idBeneficiary = Binary.fromBytes(userPublicKey);\n\n // Define the origin caller\n // This is a regular signed account owned by a user\n let origin = PolkadotRuntimeOriginCaller.system(\n DispatchRawOrigin.Signed(userAddress),\n );\n\n // Define a transaction to transfer assets from Polkadot to Pop Network using a Reserve Transfer\n const tx = paseoApi.tx.XcmPallet.limited_reserve_transfer_assets({\n dest: XcmVersionedLocation.V3({\n parents: 0,\n interior: XcmV3Junctions.X1(\n XcmV3Junction.Parachain(popParaID), // Destination is the Pop Network parachain\n ),\n }),\n beneficiary: XcmVersionedLocation.V3({\n parents: 0,\n interior: XcmV3Junctions.X1(\n XcmV3Junction.AccountId32({\n // Beneficiary address on Pop Network\n network: undefined,\n id: idBeneficiary,\n }),\n ),\n }),\n assets: XcmVersionedAssets.V3([\n {\n id: XcmV3MultiassetAssetId.Concrete({\n parents: 0,\n interior: XcmV3Junctions.Here(), // Native asset from the sender. 
In this case PAS\n }),\n fun: XcmV3MultiassetFungibility.Fungible(120000000000n), // Asset amount to transfer\n },\n ]),\n fee_asset_item: 0, // Asset used to pay transaction fees\n weight_limit: XcmV3WeightLimit.Unlimited(), // No weight limit on transaction\n });\n\n // Execute the dry run call to simulate the transaction\n const dryRunResult = await paseoApi.apis.DryRunApi.dry_run_call(\n origin,\n tx.decodedCall,\n );\n\n // Extract the data from the dry run result\n const {\n execution_result: executionResult,\n emitted_events: emmittedEvents,\n local_xcm: localXcm,\n forwarded_xcms: forwardedXcms,\n } = dryRunResult.value;\n\n // Extract the XCM generated by this call\n const xcmsToPop = forwardedXcms.find(\n ([location, _]) =>\n location.type === 'V4' &&\n location.value.parents === 0 &&\n location.value.interior.type === 'X1' &&\n location.value.interior.value.type === 'Parachain' &&\n location.value.interior.value.value === popParaID, // Pop network's ParaID\n );\n const destination = xcmsToPop[0];\n const remoteXcm = xcmsToPop[1][0];\n\n // Print the results\n const resultObject = {\n execution_result: executionResult,\n emitted_events: emmittedEvents,\n local_xcm: localXcm,\n destination: destination,\n remote_xcm: remoteXcm,\n };\n\n console.dir(resultObject, { depth: null });\n\n client.destroy();\n\n ```\n\n ***Output***\n\n
\n
\n        {\n          execution_result: {\n            success: true,\n            value: {\n              actual_weight: undefined,\n              pays_fee: { type: 'Yes', value: undefined }\n            }\n          },\n          emitted_events: [\n                ...\n          ],\n          local_xcm: undefined,\n          destination: {\n            type: 'V4',\n            value: {\n              parents: 0,\n              interior: { type: 'X1', value: { type: 'Parachain', value: 4001 } }\n            }\n          },\n          remote_xcm: {\n            type: 'V3',\n            value: [\n              {\n                type: 'ReserveAssetDeposited',\n                value: [\n                  {\n                    id: {\n                      type: 'Concrete',\n                      value: {\n                        parents: 1,\n                        interior: { type: 'Here', value: undefined }\n                      }\n                    },\n                    fun: { type: 'Fungible', value: 120000000000n }\n                  }\n                ]\n              },\n              { type: 'ClearOrigin', value: undefined },\n              {\n                type: 'BuyExecution',\n                value: {\n                  fees: {\n                    id: {\n                      type: 'Concrete',\n                      value: {\n                        parents: 1,\n                        interior: { type: 'Here', value: undefined }\n                      }\n                    },\n                    fun: { type: 'Fungible', value: 120000000000n }\n                  },\n                  weight_limit: { type: 'Unlimited', value: undefined }\n                }\n              },\n              {\n                type: 'DepositAsset',\n                value: {\n                  assets: { type: 'Wild', value: { type: 'AllCounted', value: 1 } },\n                  beneficiary: {\n                    parents: 0,\n                    interior: {\n                      type: 'X1',\n                      value: {\n                        type: 'AccountId32',\n                        value: {\n                          network: undefined,\n                          id: FixedSizeBinary {\n                            asText: [Function (anonymous)],\n                            asHex: [Function (anonymous)],\n                            asOpaqueHex: [Function (anonymous)],\n                            asBytes: [Function (anonymous)],\n                            asOpaqueBytes: [Function (anonymous)]\n                          }\n                        }\n                      }\n                    }\n                  }\n                }\n              },\n              {\n                type: 'SetTopic',\n                value: FixedSizeBinary {\n                  asText: [Function (anonymous)],\n                  asHex: [Function (anonymous)],\n                  asOpaqueHex: [Function (anonymous)],\n                  asBytes: [Function (anonymous)],\n                  asOpaqueBytes: [Function (anonymous)]\n                }\n              }\n            ]\n          }\n        }      \n      
\n
\n\n ---"} @@ -115,7 +115,7 @@ {"page_id": "develop-interoperability-xcm-runtime-apis", "page_title": "XCM Runtime APIs", "index": 7, "depth": 3, "title": "Query Weight to Asset Fee", "anchor": "query-weight-to-asset-fee", "start_char": 25168, "end_char": 28210, "estimated_token_count": 699, "token_estimator": "heuristic-v1", "text": "### Query Weight to Asset Fee\n\nConverts a given weight into the corresponding fee for a specified `AssetId`. It allows clients to determine the cost of execution in terms of the desired asset.\n\n```rust\nfn query_weight_to_asset_fee(weight: Weight, asset: VersionedAssetId) -> Result;\n```\n\n??? interface \"Input parameters\"\n\n `weight` ++\"Weight\"++ ++\"required\"++\n \n The execution weight to be converted into a fee.\n\n ??? child \"Type `Weight`\"\n\n `ref_time` ++\"u64\"++\n\n The weight of computational time used based on some reference hardware.\n\n ---\n\n `proof_size` ++\"u64\"++\n\n The weight of storage space used by proof of validity.\n\n ---\n\n ---\n\n `asset` ++\"VersionedAssetId\"++ ++\"required\"++\n \n The asset in which the fee will be calculated. This must be a versioned asset ID compatible with the runtime.\n\n ---\n\n??? interface \"Output parameters\"\n\n ++\"Result\"++\n \n The fee needed to pay for the execution for the given `AssetId.`\n\n ??? child \"Type `Error`\"\n\n Enum:\n\n - **`Unimplemented`**: An API part is unsupported.\n - **`VersionedConversionFailed`**: Converting a versioned data structure from one version to another failed.\n - **`WeightNotComputable`**: XCM message weight calculation failed.\n - **`UnhandledXcmVersion`**: XCM version not able to be handled.\n - **`AssetNotFound`**: The given asset is not handled as a fee asset.\n - **`Unroutable`**: Destination is known to be unroutable.\n\n ---\n\n??? interface \"Example\"\n\n This example demonstrates how to calculate the fee for a given execution weight using a specific versioned asset ID (PAS token) on Paseo Asset Hub.\n\n ***Usage with PAPI***\n\n ```js\n import { paseoAssetHub } from '@polkadot-api/descriptors';\n import { createClient } from 'polkadot-api';\n import { getWsProvider } from 'polkadot-api/ws-provider/web';\n import { withPolkadotSdkCompat } from 'polkadot-api/polkadot-sdk-compat';\n\n // Connect to the polkadot relay chain\n const client = createClient(\n withPolkadotSdkCompat(getWsProvider('wss://asset-hub-paseo-rpc.dwellir.com')),\n );\n\n const paseoAssetHubApi = client.getTypedApi(paseoAssetHub);\n\n // Define the weight to convert to fee\n const weight = { ref_time: 15574200000n, proof_size: 359300n };\n\n // Define the versioned asset id\n const versionedAssetId = {\n type: 'V4',\n value: { parents: 1, interior: { type: 'Here', value: undefined } },\n };\n\n // Execute the runtime call to convert the weight to fee\n const result =\n await paseoAssetHubApi.apis.XcmPaymentApi.query_weight_to_asset_fee(\n weight,\n versionedAssetId,\n );\n\n // Print the fee\n console.dir(result.value, { depth: null });\n\n client.destroy();\n\n ```\n\n ***Output***\n\n
\n 1796500000n\n
\n\n ---"} {"page_id": "develop-interoperability-xcm-runtime-apis", "page_title": "XCM Runtime APIs", "index": 8, "depth": 3, "title": "Query Delivery Fees", "anchor": "query-delivery-fees", "start_char": 28210, "end_char": 33184, "estimated_token_count": 965, "token_estimator": "heuristic-v1", "text": "### Query Delivery Fees\n\nRetrieves the delivery fees for sending a specific XCM message to a designated destination. The fees are always returned in a specific asset defined by the destination chain.\n\n```rust\nfn query_delivery_fees(destination: VersionedLocation, message: VersionedXcm<()>) -> Result;\n```\n\n??? interface \"Input parameters\"\n\n `destination` ++\"VersionedLocation\"++ ++\"required\"++\n \n The target location where the message will be sent. Fees may vary depending on the destination, as different destinations often have unique fee structures and sender mechanisms.\n\n ---\n\n `message` ++\"VersionedXcm<()>\"++ ++\"required\"++\n \n The XCM message to be sent. The delivery fees are calculated based on the message's content and size, which can influence the cost.\n\n ---\n\n??? interface \"Output parameters\"\n\n ++\"Result\"++\n \n The calculated delivery fees expressed in a specific asset supported by the destination chain. If an error occurs during the query, it returns an error instead.\n\n ??? child \"Type `Error`\"\n\n Enum:\n\n - **`Unimplemented`**: An API part is unsupported.\n - **`VersionedConversionFailed`**: Converting a versioned data structure from one version to another failed.\n - **`WeightNotComputable`**: XCM message weight calculation failed.\n - **`UnhandledXcmVersion`**: XCM version not able to be handled.\n - **`AssetNotFound`**: The given asset is not handled as a fee asset.\n - **`Unroutable`**: Destination is known to be unroutable.\n\n ---\n\n??? 
interface \"Example\"\n\n This example demonstrates how to query the delivery fees for sending an XCM message from Paseo to Paseo Asset Hub.\n\n Replace `INSERT_USER_ADDRESS` with your SS58 address before running the script.\n\n ***Usage with PAPI***\n\n ```js\n import { createClient } from 'polkadot-api';\n import { getWsProvider } from 'polkadot-api/ws-provider/web';\n import { withPolkadotSdkCompat } from 'polkadot-api/polkadot-sdk-compat';\n import {\n XcmVersionedXcm,\n paseo,\n XcmVersionedLocation,\n XcmV3Junction,\n XcmV3Junctions,\n XcmV3WeightLimit,\n XcmV3MultiassetFungibility,\n XcmV3MultiassetAssetId,\n XcmV3Instruction,\n XcmV3MultiassetMultiAssetFilter,\n XcmV3MultiassetWildMultiAsset,\n } from '@polkadot-api/descriptors';\n import { Binary } from 'polkadot-api';\n import { ss58Decode } from '@polkadot-labs/hdkd-helpers';\n\n const client = createClient(\n withPolkadotSdkCompat(getWsProvider('wss://paseo-rpc.dwellir.com')),\n );\n\n const paseoApi = client.getTypedApi(paseo);\n\n const paseoAssetHubParaID = 1000;\n const userAddress = 'INSERT_USER_ADDRESS';\n const userPublicKey = ss58Decode(userAddress)[0];\n const idBeneficiary = Binary.fromBytes(userPublicKey);\n\n // Define the destination\n const destination = XcmVersionedLocation.V3({\n parents: 0,\n interior: XcmV3Junctions.X1(XcmV3Junction.Parachain(paseoAssetHubParaID)),\n });\n\n // Define the xcm message that will be sent to the destination\n const xcm = XcmVersionedXcm.V3([\n XcmV3Instruction.ReceiveTeleportedAsset([\n {\n id: XcmV3MultiassetAssetId.Concrete({\n parents: 1,\n interior: XcmV3Junctions.Here(),\n }),\n fun: XcmV3MultiassetFungibility.Fungible(12000000000n),\n },\n ]),\n XcmV3Instruction.ClearOrigin(),\n XcmV3Instruction.BuyExecution({\n fees: {\n id: XcmV3MultiassetAssetId.Concrete({\n parents: 1,\n interior: XcmV3Junctions.Here(),\n }),\n fun: XcmV3MultiassetFungibility.Fungible(BigInt(12000000000n)),\n },\n weight_limit: XcmV3WeightLimit.Unlimited(),\n }),\n XcmV3Instruction.DepositAsset({\n assets: XcmV3MultiassetMultiAssetFilter.Wild(\n XcmV3MultiassetWildMultiAsset.All(),\n ),\n beneficiary: {\n parents: 0,\n interior: XcmV3Junctions.X1(\n XcmV3Junction.AccountId32({\n network: undefined,\n id: idBeneficiary,\n }),\n ),\n },\n }),\n ]);\n\n // Execute the query delivery fees runtime call\n const result = await paseoApi.apis.XcmPaymentApi.query_delivery_fees(\n destination,\n xcm,\n );\n\n // Print the results\n console.dir(result.value, { depth: null });\n\n client.destroy();\n\n ```\n\n ***Output***\n\n
\n
\n        {\n          type: 'V3',\n          value: [\n            {\n              id: {\n                type: 'Concrete',\n                value: { parents: 0, interior: { type: 'Here', value: undefined } }\n              },\n              fun: { type: 'Fungible', value: 396000000n }\n            }\n          ]\n        }\n      
\n
\n\n ---"} {"page_id": "develop-interoperability", "page_title": "Interoperability", "index": 0, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 872, "end_char": 922, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} -{"page_id": "develop-interoperability", "page_title": "Interoperability", "index": 1, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 922, "end_char": 2375, "estimated_token_count": 390, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} +{"page_id": "develop-interoperability", "page_title": "Interoperability", "index": 1, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 922, "end_char": 2331, "estimated_token_count": 378, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} {"page_id": "develop-networks", "page_title": "Networks", "index": 0, "depth": 2, "title": "Introduction", "anchor": "introduction", "start_char": 12, "end_char": 513, "estimated_token_count": 77, "token_estimator": "heuristic-v1", "text": "## Introduction\n\nThe Polkadot ecosystem consists of multiple networks designed to support different stages of blockchain development, from main networks to test networks. Each network serves a unique purpose, providing developers with flexible environments for building, testing, and deploying blockchain applications.\n\nThis section includes essential network information such as RPC endpoints, currency symbols and decimals, and how to acquire TestNet tokens for the Polkadot ecosystem of networks."} {"page_id": "develop-networks", "page_title": "Networks", "index": 1, "depth": 2, "title": "Production Networks", "anchor": "production-networks", "start_char": 513, "end_char": 537, "estimated_token_count": 4, "token_estimator": "heuristic-v1", "text": "## Production Networks"} {"page_id": "develop-networks", "page_title": "Networks", "index": 2, "depth": 3, "title": "Polkadot", "anchor": "polkadot", "start_char": 537, "end_char": 1992, "estimated_token_count": 382, "token_estimator": "heuristic-v1", "text": "### Polkadot\n\nPolkadot is the primary production blockchain network for high-stakes, enterprise-grade applications. 
Polkadot MainNet has been running since May 2020 and has implementations in various programming languages ranging from Rust to JavaScript.\n\n=== \"Network Details\"\n\n **Currency symbol**: `DOT`\n\n ---\n \n **Currency decimals**: 10\n\n ---\n\n **Block explorer**: [Polkadot Subscan](https://polkadot.subscan.io/){target=\\_blank}\n\n=== \"RPC Endpoints\"\n\n Blockops\n\n ```\n wss://polkadot-public-rpc.blockops.network/ws\n ```\n\n ---\n\n Dwellir\n\n ```\n wss://polkadot-rpc.dwellir.com\n ```\n\n ---\n\n Dwellir Tunisia\n\n ```\n wss://polkadot-rpc-tn.dwellir.com\n ```\n\n ---\n\n IBP1\n\n ```\n wss://rpc.ibp.network/polkadot\n ```\n\n ---\n\n IBP2\n\n ```\n wss://polkadot.dotters.network\n ```\n\n ---\n\n LuckyFriday\n\n ```\n wss://rpc-polkadot.luckyfriday.io\n ```\n\n ---\n\n OnFinality\n\n ```\n wss://polkadot.api.onfinality.io/public-ws\n ```\n\n ---\n\n RadiumBlock\n\n ```\n wss://polkadot.public.curie.radiumblock.co/ws\n ```\n\n ---\n\n RockX\n\n ```\n wss://rockx-dot.w3node.com/polka-public-dot/ws\n ```\n\n ---\n\n Stakeworld\n\n ```\n wss://dot-rpc.stakeworld.io\n ```\n\n ---\n\n SubQuery\n\n ```\n wss://polkadot.rpc.subquery.network/public/ws\n ```\n\n ---\n\n Light client\n\n ```\n light://substrate-connect/polkadot\n ```"} @@ -162,7 +162,7 @@ {"page_id": "develop-parachains-customize-parachain-overview", "page_title": "Overview of FRAME", "index": 7, "depth": 3, "title": "Parachain Templates", "anchor": "parachain-templates", "start_char": 7588, "end_char": 9140, "estimated_token_count": 338, "token_estimator": "heuristic-v1", "text": "### Parachain Templates\n\nParachain templates are specifically designed for chains that will connect to and interact with relay chains in the Polkadot ecosystem:\n\n- **[`parachain-template`](https://github.com/paritytech/polkadot-sdk/tree/master/templates/parachain){target=\\_blank}**: Designed for connecting to relay chains like Polkadot, Kusama, or Paseo, this template enables a chain to operate as a parachain. For projects aiming to integrate with Polkadot’s ecosystem, this template offers a great starting point.\n\n- **[`OpenZeppelin`](https://github.com/OpenZeppelin/polkadot-runtime-templates/tree/main){target=\\_blank}**: Offers two flexible starting points.\n - The [`generic-runtime-template`](https://github.com/OpenZeppelin/polkadot-runtime-templates/tree/main/generic-template){target=\\_blank} provides a minimal setup with essential pallets and secure defaults, creating a reliable foundation for custom blockchain development.\n - The [`evm-runtime-template`](https://github.com/OpenZeppelin/polkadot-runtime-templates/tree/main/evm-template){target=\\_blank} enables EVM compatibility, allowing developers to migrate Solidity contracts and EVM-based dApps. This template is ideal for Ethereum developers looking to leverage Substrate's capabilities.\n\nChoosing a suitable template depends on your project’s unique requirements, level of customization, and integration needs. 
Starting from a template speeds up development and lets you focus on implementing your chain’s unique features rather than the foundational blockchain setup."} {"page_id": "develop-parachains-customize-parachain-overview", "page_title": "Overview of FRAME", "index": 8, "depth": 2, "title": "Where to Go Next", "anchor": "where-to-go-next", "start_char": 9140, "end_char": 9439, "estimated_token_count": 71, "token_estimator": "heuristic-v1", "text": "## Where to Go Next\n\nFor more detailed information on implementing this process, refer to the following sections:\n\n- [Add a Pallet to Your Runtime](/develop/parachains/customize-parachain/add-existing-pallets/)\n- [Create a Custom Pallet](/develop/parachains/customize-parachain/make-custom-pallet/)"} {"page_id": "develop-parachains-customize-parachain", "page_title": "Customize Your Parachain", "index": 0, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 671, "end_char": 721, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} -{"page_id": "develop-parachains-customize-parachain", "page_title": "Customize Your Parachain", "index": 1, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 721, "end_char": 1899, "estimated_token_count": 323, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} +{"page_id": "develop-parachains-customize-parachain", "page_title": "Customize Your Parachain", "index": 1, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 721, "end_char": 1866, "estimated_token_count": 314, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} {"page_id": "develop-parachains-deployment-build-deterministic-runtime", "page_title": "Build a deterministic runtime", "index": 0, "depth": 2, "title": "Introduction", "anchor": "introduction", "start_char": 33, "end_char": 1201, "estimated_token_count": 203, "token_estimator": "heuristic-v1", "text": "## Introduction\n\nBy default, the Rust compiler produces optimized Wasm binaries. These binaries are suitable for working in an isolated environment, such as local development. However, the Wasm binaries the compiler builds by default aren't guaranteed to be deterministically reproducible. Each time the compiler generates the Wasm runtime, it might produce a slightly different Wasm byte code. This is problematic in a blockchain network where all nodes must use exactly the same raw chain specification file.\n\nWorking with builds that aren't guaranteed to be deterministically reproducible can cause other problems, too. For example, for automating the build processes for a blockchain, it is ideal that the same code always produces the same result (in terms of bytecode). Compiling the Wasm runtime with every push would produce inconsistent and unpredictable results without a deterministic build, making it difficult to integrate with any automation and likely to break a CI/CD pipeline continuously. 
Deterministic builds—code that always compiles to exactly the same bytecode—ensure that the Wasm runtime can be inspected, audited, and independently verified."} {"page_id": "develop-parachains-deployment-build-deterministic-runtime", "page_title": "Build a deterministic runtime", "index": 1, "depth": 2, "title": "Prerequisites", "anchor": "prerequisites", "start_char": 1201, "end_char": 1327, "estimated_token_count": 37, "token_estimator": "heuristic-v1", "text": "## Prerequisites\n\nBefore you begin, ensure you have [Docker](https://www.docker.com/get-started/){target=\\_blank} installed."} {"page_id": "develop-parachains-deployment-build-deterministic-runtime", "page_title": "Build a deterministic runtime", "index": 2, "depth": 2, "title": "Tooling for Wasm Runtime", "anchor": "tooling-for-wasm-runtime", "start_char": 1327, "end_char": 2108, "estimated_token_count": 173, "token_estimator": "heuristic-v1", "text": "## Tooling for Wasm Runtime\n\nTo compile the Wasm runtime deterministically, the same tooling that produces the runtime for Polkadot, Kusama, and other Polkadot SDK-based chains can be used. This tooling, referred to collectively as the Substrate Runtime Toolbox or [`srtool`](https://github.com/paritytech/srtool){target=\\_blank}, ensures that the same source code consistently compiles to an identical Wasm blob.\n\nThe core component of `srtool` is a Docker container executed as part of a Docker image. The name of the `srtool` Docker image specifies the version of the Rust compiler used to compile the code included in the image. For example, the image `paritytech/srtool:1.88.0` indicates that the code in the image was compiled with version `1.88.0` of the `rustc` compiler."} @@ -208,7 +208,7 @@ {"page_id": "develop-parachains-deployment-obtain-coretime", "page_title": "Obtain Coretime", "index": 5, "depth": 3, "title": "On-demand Coretime", "anchor": "on-demand-coretime", "start_char": 3260, "end_char": 3713, "estimated_token_count": 97, "token_estimator": "heuristic-v1", "text": "### On-demand Coretime\n\nOn-demand coretime allows for flexible, as-needed block production. To purchase:\n\n1. Ensure your collator node is fully synchronized with the relay chain.\n2. Submit the `onDemand.placeOrderAllowDeath` extrinsic on the relay chain with:\n\n - **`maxAmountFor`**: Sufficient funds for the transaction.\n - **`paraId`**: Your registered `ParaID`.\n\nAfter successfully executing the extrinsic, your parachain will produce a block."} {"page_id": "develop-parachains-deployment", "page_title": "Deployment", "index": 0, "depth": 2, "title": "Deployment Process", "anchor": "deployment-process", "start_char": 371, "end_char": 4371, "estimated_token_count": 837, "token_estimator": "heuristic-v1", "text": "## Deployment Process\n\nTaking your Polkadot SDK-based blockchain from a local environment to production involves several steps, ensuring your network is stable, secure, and ready for real-world use. The following diagram outlines the process at a high level:\n\n```mermaid\nflowchart TD\n %% Group 1: Pre-Deployment\n subgraph group1 [Pre-Deployment]\n direction LR\n A(\"Local
Development
and Testing\") --> B(\"Runtime
Compilation\")\n B --> C(\"Generate
Chain
Specifications\")\n C --> D(\"Prepare
Deployment
Environment\")\n D --> E(\"Acquire
Coretime\")\n end\n \n %% Group 2: Deployment\n subgraph group2 [Deployment]\n F(\"Launch
and
Monitor\")\n end\n\n %% Group 3: Post-Deployment\n subgraph group3 [Post-Deployment]\n G(\"Maintenance
and
Upgrades\")\n end\n\n %% Connections Between Groups\n group1 --> group2\n group2 --> group3\n\n %% Styling\n style group1 stroke:#6e7391,stroke-width:1px\n style group2 stroke:#6e7391,stroke-width:1px\n style group3 stroke:#6e7391,stroke-width:1px\n```\n\n- **Local development and testing**: The process begins with local development and testing. Developers focus on building the runtime by selecting and configuring the necessary pallets while refining network features. In this phase, running a local TestNet is essential to verify transactions and ensure the blockchain behaves as expected. Unit and integration tests ensure the network works as expected before launch. Thorough testing is conducted, not only for individual components but also for interactions between pallets.\n\n- **Runtime compilation**: Polkadot SDK-based blockchains are built with Wasm, a highly portable and efficient format. Compiling your blockchain's runtime into Wasm ensures it can be executed reliably across various environments, guaranteeing network-wide compatibility and security. The [srtool](https://github.com/paritytech/srtool){target=\\_blank} is helpful for this purpose since it allows you to compile [deterministic runtimes](/develop/parachains/deployment/build-deterministic-runtime/){target=\\_blank}.\n\n- **Generate chain specifications**: The chain spec file defines the structure and configuration of your blockchain. It includes initial node identities, session keys, and other parameters. Defining a well-thought-out chain specification ensures that your network will operate smoothly and according to your intended design.\n\n- **Deployment environment**: Whether launching a local test network or a production-grade blockchain, selecting the proper infrastructure is vital. For further information about these topics, see the [Infrastructure](/infrastructure/){target=\\_blank} section.\n\n- **Acquire coretime**: To build on top of the Polkadot network, users need to acquire coretime (either on-demand or in bulk) to access the computational resources of the relay chain. This allows for the secure validation of parachain blocks through a randomized selection of relay chain validators.\n\n If you’re building a standalone blockchain (solochain) that won’t connect to Polkadot as a parachain, you can skip the preceding step, as there’s no need to acquire coretime or implement [Cumulus](/develop/parachains/#cumulus){target=\\_blank}.\n\n- **Launch and monitor**: Once everything is configured, you can launch the blockchain, initiating the network with your chain spec and Wasm runtime. Validators or collators will begin producing blocks, and the network will go live. Post-launch, monitoring is vital to ensuring network health—tracking block production, node performance, and overall security.\n\n- **Maintenance and upgrade**: A blockchain continues to evolve post-deployment. As the network expands and adapts, it may require runtime upgrades, governance updates, coretime renewals, and even modifications to the underlying code. 
For an in-depth guide on this topic, see the [Maintenance](/develop/parachains/maintenance/){target=\\_blank} section."} {"page_id": "develop-parachains-deployment", "page_title": "Deployment", "index": 1, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 4371, "end_char": 4421, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} -{"page_id": "develop-parachains-deployment", "page_title": "Deployment", "index": 2, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 4421, "end_char": 4806, "estimated_token_count": 111, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} +{"page_id": "develop-parachains-deployment", "page_title": "Deployment", "index": 2, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 4421, "end_char": 4795, "estimated_token_count": 108, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} {"page_id": "develop-parachains-install-polkadot-sdk", "page_title": "Install Polkadot SDK Dependencies", "index": 0, "depth": 2, "title": "macOS", "anchor": "macos", "start_char": 324, "end_char": 463, "estimated_token_count": 25, "token_estimator": "heuristic-v1", "text": "## macOS\n\nYou can install Rust and set up a Substrate development environment on Apple macOS computers with Intel or Apple M1 processors."} {"page_id": "develop-parachains-install-polkadot-sdk", "page_title": "Install Polkadot SDK Dependencies", "index": 1, "depth": 3, "title": "Before You Begin", "anchor": "before-you-begin", "start_char": 463, "end_char": 1915, "estimated_token_count": 334, "token_estimator": "heuristic-v1", "text": "### Before You Begin\n\nBefore you install Rust and set up your development environment on macOS, verify that your computer meets the following basic requirements:\n\n- Operating system version is 10.7 Lion or later.\n- Processor speed of at least 2 GHz. Note that 3 GHz is recommended.\n- Memory of at least 8 GB RAM. Note that 16 GB is recommended.\n- Storage of at least 10 GB of available space.\n- Broadband Internet connection.\n\n#### Install Homebrew\n\nIn most cases, you should use Homebrew to install and manage packages on macOS computers. If you don't already have Homebrew installed on your local computer, you should download and install it before continuing.\n\nTo install Homebrew:\n\n1. Open the Terminal application.\n2. Download and install Homebrew by running the following command:\n\n ```bash\n /bin/bash -c \"$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install.sh)\"\n ```\n\n3. Verify Homebrew has been successfully installed by running the following command:\n\n ```bash\n brew --version\n ```\n\n The command displays output similar to the following:\n\n
\n brew --version\n Homebrew 4.3.15\n
\n\n#### Support for Apple Silicon\n\nProtobuf must be installed before the build process can begin. To install it, run the following command:\n\n```bash\nbrew install protobuf\n```"} {"page_id": "develop-parachains-install-polkadot-sdk", "page_title": "Install Polkadot SDK Dependencies", "index": 2, "depth": 3, "title": "Install Required Packages and Rust", "anchor": "install-required-packages-and-rust", "start_char": 1915, "end_char": 3306, "estimated_token_count": 291, "token_estimator": "heuristic-v1", "text": "### Install Required Packages and Rust\n\nBecause the blockchain requires standard cryptography to support the generation of public/private key pairs and the validation of transaction signatures, you must also have a package that provides cryptography, such as `openssl`.\n\nTo install `openssl` and the Rust toolchain on macOS:\n\n1. Open the Terminal application.\n2. Ensure you have an updated version of Homebrew by running the following command:\n\n ```bash\n brew update\n ```\n\n3. Install the `openssl` package by running the following command:\n\n ```bash\n brew install openssl\n ```\n\n4. Download the `rustup` installation program and use it to install Rust by running the following command:\n\n ```bash\n curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh\n ```\n\n5. Follow the prompts displayed to proceed with a default installation.\n6. Update your current shell to include Cargo by running the following command:\n\n ```bash\n source ~/.cargo/env\n ```\n\n7. Configure the Rust toolchain to default to the latest stable version by running the following commands:\n\n ```bash\n rustup default stable\n rustup update\n rustup target add wasm32-unknown-unknown\n rustup component add rust-src\n ```\n\n8. [Verify your installation](#verifying-installation).\n9. Install `cmake` using the following command:\n\n ```bash\n brew install cmake\n ```"} @@ -264,7 +264,7 @@ {"page_id": "develop-parachains-maintenance-unlock-parachain", "page_title": "Unlock a Parachain", "index": 4, "depth": 3, "title": "Fund the Sovereign Account", "anchor": "fund-the-sovereign-account", "start_char": 4169, "end_char": 6045, "estimated_token_count": 414, "token_estimator": "heuristic-v1", "text": "### Fund the Sovereign Account\n\nFor a successful XCM execution, the [sovereign account](https://github.com/polkadot-fellows/xcm-format/blob/10726875bd3016c5e528c85ed6e82415e4b847d7/README.md?plain=1#L50){target=\\_blank} of your parachain on the relay chain must have sufficient funds to cover transaction fees. The sovereign account is a deterministic address derived from your parachain ID.\n\nYou can identify your parachain's sovereign account using either of these methods:\n\n=== \"Runtime API\"\n\n Execute the `locationToAccountApi.convertLocation` runtime API call to convert your parachain's location into its sovereign account address on the relay chain.\n\n ![](/images/develop/parachains/maintenance/unlock-parachain/unlock-parachain-7.webp)\n\n=== \"Substrate Utilities\"\n\n Use the **\"Para ID\" to Address** section in [Substrate Utilities](https://www.shawntabrizi.com/substrate-js-utilities/){target=\\_blank} with the **Child** option selected.\n\n=== \"Manual Calculation\"\n\n 1. Identify the appropriate prefix:\n\n - For parent/child chains use the prefix `0x70617261` (which decodes to `b\"para\"`).\n \n 2. Encode your parachain ID as a u32 [SCALE](/polkadot-protocol/parachain-basics/data-encoding#data-types){target=\\_blank} value:\n\n - For parachain 2006, this would be `d6070000`.\n\n 3. 
Combine the prefix with the encoded ID to form the sovereign account address:\n\n - **Hex**: `0x70617261d6070000000000000000000000000000000000000000000000000000`\n - **SS58 format**: `5Ec4AhPW97z4ZyYkd3mYkJrSeZWcwVv4wiANES2QrJi1x17F`\n\nYou can transfer funds to this account from any account on the relay chain using a standard transfer. To calculate the amount needed, refer to the [XCM Payment API](/develop/interoperability/xcm-runtime-apis/#xcm-payment-api){target=\\_blank}. The calculation will depend on the XCM built in the next step."} {"page_id": "develop-parachains-maintenance-unlock-parachain", "page_title": "Unlock a Parachain", "index": 5, "depth": 3, "title": "Craft and Submit the XCM", "anchor": "craft-and-submit-the-xcm", "start_char": 6045, "end_char": 9223, "estimated_token_count": 710, "token_estimator": "heuristic-v1", "text": "### Craft and Submit the XCM\n\nWith the call data prepared and the sovereign account funded, you can now construct and send the XCM from your parachain to the relay chain. The XCM will need to perform several operations in sequence:\n\n1. Withdraw DOT from your parachain's sovereign account.\n2. Buy execution to pay for transaction fees.\n3. Execute the `registrar.removeLock` extrinsic.\n4. Return any unused funds to your sovereign account.\n\nHere's how to submit this XCM using Astar (Parachain 2006) as an example:\n\n1. In [Polkadot.js Apps](https://polkadot.js.org/apps/#/explorer){target=\\_blank}, connect to the parachain, navigate to the **Developer** dropdown and select the **Extrinsics** option.\n\n2. Create a `sudo.sudo` extrinsic that executes `polkadotXcm.send`:\n\n 1. Use the `sudo.sudo` extrinsic to execute the following call as Root.\n 2. Select the **polkadotXcm** pallet.\n 3. Choose the **send** extrinsic.\n 4. Set the **dest** parameter as the relay chain.\n\n ![](/images/develop/parachains/maintenance/unlock-parachain/unlock-parachain-4.webp)\n\n3. Construct the XCM and submit it:\n\n 1. Add a **WithdrawAsset** instruction.\n 2. Add a **BuyExecution** instruction.\n - **fees**:\n - **id**: The asset location to use for the fee payment. In this example, the relay chain native asset is used.\n - **fun**: Select `Fungible` and use the same amount you withdrew from the sovereign account in the previous step.\n - **weightLimit**: Use `Unlimited`.\n 3. Add a **Transact** instruction with the following parameters:\n - **originKind**: Use `Native`.\n - **requireWeightAtMost**: Use the weight calculated previously.\n - **call**: Use the encoded call data generated before.\n 4. Add a **RefundSurplus** instruction.\n 5. Add a **DepositAsset** instruction to send the remaining funds to the parachain sovereign account.\n 6. Click the **Submit Transaction** button.\n\n ![](/images/develop/parachains/maintenance/unlock-parachain/unlock-parachain-5.webp)\n\n If the amount withdrawn in the first instruction is exactly the amount needed to pay the transaction fees, instructions 4 and 5 can be omitted.\n\n To validate your XCM, examine the following reference [extrinsic](https://polkadot.js.org/apps/?rpc=wss%3A%2F%2Fastar.public.curie.radiumblock.co%2Fws#/extrinsics/decode/0x63003300040100041400040000000700e40b5402130000000700e40b540200060042d3c91800184604d6070000140d0100000100591f){target=_blank} showing the proper instruction sequence and parameter formatting. 
Following this structure will help ensure successful execution of your message.\n\nAfter submitting the transaction, wait for it to be finalized and then verify that your parachain has been successfully unlocked by following the steps described in the [Check if the Parachain is Locked](#check-if-the-parachain-is-locked) section. If the parachain shows as unlocked, your operation has been successful. If it still appears locked, verify that your XCM transaction was processed correctly and consider troubleshooting the XCM built.\n\n![](/images/develop/parachains/maintenance/unlock-parachain/unlock-parachain-6.webp)"} {"page_id": "develop-parachains-maintenance", "page_title": "Maintenance", "index": 0, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 446, "end_char": 496, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} -{"page_id": "develop-parachains-maintenance", "page_title": "Maintenance", "index": 1, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 496, "end_char": 1372, "estimated_token_count": 224, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} +{"page_id": "develop-parachains-maintenance", "page_title": "Maintenance", "index": 1, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 496, "end_char": 1350, "estimated_token_count": 218, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} {"page_id": "develop-parachains-testing-benchmarking", "page_title": "Benchmarking FRAME Pallets", "index": 0, "depth": 2, "title": "Introduction", "anchor": "introduction", "start_char": 16, "end_char": 1221, "estimated_token_count": 239, "token_estimator": "heuristic-v1", "text": "## Introduction\n\nBenchmarking is a critical component of developing efficient and secure blockchain runtimes. In the Polkadot ecosystem, accurately benchmarking your custom pallets ensures that each extrinsic has a precise [weight](/polkadot-protocol/glossary/#weight){target=\\_blank}, representing its computational and storage demands. This process is vital for maintaining the blockchain's performance and preventing potential vulnerabilities, such as Denial of Service (DoS) attacks.\n\nThe Polkadot SDK leverages the [FRAME](/polkadot-protocol/glossary/#frame-framework-for-runtime-aggregation-of-modularized-entities){target=\\_blank} benchmarking framework, offering tools to measure and assign weights to extrinsics. These weights help determine the maximum number of transactions or system-level calls processed within a block. This guide covers how to use FRAME's [benchmarking framework](https://paritytech.github.io/polkadot-sdk/master/frame_benchmarking/v2/index.html){target=\\_blank}, from setting up your environment to writing and running benchmarks for your custom pallets. 
You'll understand how to generate accurate weights by the end, ensuring your runtime remains performant and secure."} {"page_id": "develop-parachains-testing-benchmarking", "page_title": "Benchmarking FRAME Pallets", "index": 1, "depth": 2, "title": "The Case for Benchmarking", "anchor": "the-case-for-benchmarking", "start_char": 1221, "end_char": 2015, "estimated_token_count": 114, "token_estimator": "heuristic-v1", "text": "## The Case for Benchmarking\n\nBenchmarking helps validate that the required execution time for different functions is within reasonable boundaries to ensure your blockchain runtime can handle transactions efficiently and securely. By accurately measuring the weight of each extrinsic, you can prevent service interruptions caused by computationally intensive calls that exceed block time limits. Without benchmarking, runtime performance could be vulnerable to DoS attacks, where malicious users exploit functions with unoptimized weights.\n\nBenchmarking also ensures predictable transaction fees. Weights derived from benchmark tests accurately reflect the resource usage of function calls, allowing fair fee calculation. This approach discourages abuse while maintaining network reliability."} {"page_id": "develop-parachains-testing-benchmarking", "page_title": "Benchmarking FRAME Pallets", "index": 2, "depth": 3, "title": "Benchmarking and Weight", "anchor": "benchmarking-and-weight", "start_char": 2015, "end_char": 3681, "estimated_token_count": 321, "token_estimator": "heuristic-v1", "text": "### Benchmarking and Weight \n\nIn Polkadot SDK-based chains, weight quantifies the computational effort needed to process transactions. This weight includes factors such as:\n\n- Computational complexity.\n- Storage complexity (proof size).\n- Database reads and writes.\n- Hardware specifications.\n\nBenchmarking uses real-world testing to simulate worst-case scenarios for extrinsics. The framework generates a linear model for weight calculation by running multiple iterations with varied parameters. These worst-case weights ensure blocks remain within execution limits, enabling the runtime to maintain throughput under varying loads. Excess fees can be refunded if a call uses fewer resources than expected, offering users a fair cost model.\n \nBecause weight is a generic unit of measurement based on computation time for a specific physical machine, the weight of any function can change based on the specifications of hardware used for benchmarking. By modeling the expected weight of each runtime function, the blockchain can calculate the number of transactions or system-level calls it can execute within a certain period.\n\nWithin FRAME, each function call that is dispatched must have a `#[pallet::weight]` annotation that can return the expected weight for the worst-case scenario execution of that function given its inputs:\n\n```rust hl_lines=\"2\"\n#[pallet::call_index(0)]\n#[pallet::weight(T::WeightInfo::do_something())]\npub fn do_something(origin: OriginFor) -> DispatchResultWithPostInfo { Ok(()) }\n```\n\nThe `WeightInfo` file is automatically generated during benchmarking. 
Based on these tests, this file provides accurate weights for each extrinsic."} @@ -288,7 +288,7 @@ {"page_id": "develop-parachains-testing-pallet-testing", "page_title": "Pallet Testing", "index": 5, "depth": 3, "title": "Event Testing", "anchor": "event-testing", "start_char": 4108, "end_char": 6129, "estimated_token_count": 519, "token_estimator": "heuristic-v1", "text": "### Event Testing\n\nIt's also crucial to test the events that your pallet emits during execution. By default, events generated in a pallet using the [`#generate_deposit`](https://paritytech.github.io/polkadot-sdk/master/frame_support/pallet_macros/attr.generate_deposit.html){target=\\_blank} macro are stored under the system's event storage key (system/events) as [`EventRecord`](https://paritytech.github.io/polkadot-sdk/master/frame_system/struct.EventRecord.html){target=\\_blank} entries. These can be accessed using [`System::events()`](https://paritytech.github.io/polkadot-sdk/master/frame_system/pallet/struct.Pallet.html#method.events){target=\\_blank} or verified with specific helper methods provided by the system pallet, such as [`assert_has_event`](https://paritytech.github.io/polkadot-sdk/master/frame_system/pallet/struct.Pallet.html#method.assert_has_event){target=\\_blank} and [`assert_last_event`](https://paritytech.github.io/polkadot-sdk/master/frame_system/pallet/struct.Pallet.html#method.assert_last_event){target=\\_blank}.\n\nHere's an example of testing events in a mock runtime:\n\n```rust\n#[test]\nfn it_emits_events_on_success() {\n new_test_ext().execute_with(|| {\n // Call an extrinsic or function\n assert_ok!(TemplateModule::some_function(Origin::signed(1), valid_param));\n\n // Verify that the expected event was emitted\n assert!(System::events().iter().any(|record| {\n record.event == Event::TemplateModule(TemplateEvent::SomeEvent)\n }));\n });\n}\n```\n\nSome key considerations are:\n\n- **Block number**: Events are not emitted on the genesis block, so you need to set the block number using [`System::set_block_number()`](https://paritytech.github.io/polkadot-sdk/master/frame_system/pallet/struct.Pallet.html#method.set_block_number){target=\\_blank} to ensure events are triggered.\n- **Converting events**: Use `.into()` when instantiating your pallet's event to convert it into a generic event type, as required by the system's event storage."} {"page_id": "develop-parachains-testing-pallet-testing", "page_title": "Pallet Testing", "index": 6, "depth": 2, "title": "Where to Go Next", "anchor": "where-to-go-next", "start_char": 6129, "end_char": 6871, "estimated_token_count": 211, "token_estimator": "heuristic-v1", "text": "## Where to Go Next\n\n- Dive into the full implementation of the [`mock.rs`](https://github.com/paritytech/polkadot-sdk/blob/master/templates/solochain/pallets/template/src/mock.rs){target=\\_blank} and [`test.rs`](https://github.com/paritytech/polkadot-sdk/blob/master/templates/solochain/pallets/template/src/tests.rs){target=\\_blank} files in the [Solochain Template](https://github.com/paritytech/polkadot-sdk/tree/master/templates/solochain){target=_blank}.\n\n
\n\n- Guide __Benchmarking__\n\n ---\n\n Explore methods to measure the performance and execution cost of your pallet.\n\n [:octicons-arrow-right-24: Reference](/develop/parachains/testing/benchmarking)\n\n
"} {"page_id": "develop-parachains-testing", "page_title": "Testing Your Polkadot SDK-Based Blockchain", "index": 0, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 479, "end_char": 529, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} -{"page_id": "develop-parachains-testing", "page_title": "Testing Your Polkadot SDK-Based Blockchain", "index": 1, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 529, "end_char": 1243, "estimated_token_count": 199, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} +{"page_id": "develop-parachains-testing", "page_title": "Testing Your Polkadot SDK-Based Blockchain", "index": 1, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 529, "end_char": 1221, "estimated_token_count": 193, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} {"page_id": "develop-parachains", "page_title": "Parachains", "index": 0, "depth": 2, "title": "Building Parachains with the Polkadot SDK", "anchor": "building-parachains-with-the-polkadot-sdk", "start_char": 275, "end_char": 1983, "estimated_token_count": 329, "token_estimator": "heuristic-v1", "text": "## Building Parachains with the Polkadot SDK\n\nWith the [Polkadot relay chain](/polkadot-protocol/architecture/polkadot-chain/){target=\\_blank} handling security and consensus, parachain developers are free to focus on features such as asset management, governance, and cross-chain communication. The Polkadot SDK equips developers with the tools to build, deploy, and maintain efficient, scalable parachains.\n\nPolkadot SDK’s FRAME framework provides developers with the tools to do the following:\n\n- **Customize parachain runtimes**: [Runtimes](/polkadot-protocol/glossary/#runtime){target=\\_blank} are the core building blocks that define the logic and functionality of Polkadot SDK-based parachains and let developers customize the parameters, rules, and behaviors that shape their blockchain network.\n- **Develop new pallets**: Create custom modular pallets to define runtime behavior and achieve desired blockchain functionality.\n- **Add smart contract functionality**: Use specialized pallets to deploy and execute smart contracts, enhancing your chain's functionality and programmability.\n- **Test your build for a confident deployment**: Create a test environment that can simulate runtime and mock transaction execution.\n- **Deploy your blockchain for use**: Take your Polkadot SDK-based blockchain from a local environment to production.\n- **Maintain your network including monitoring and upgrades**: Runtimes can be upgraded through forkless runtime updates, enabling seamless evolution of the parachain.\n\nNew to parachain development? 
Start with the [Introduction to the Polkadot SDK](/develop/parachains/intro-polkadot-sdk/) to discover how this framework simplifies building custom parachains."} {"page_id": "develop-parachains", "page_title": "Parachains", "index": 1, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 1983, "end_char": 2032, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} {"page_id": "develop-smart-contracts-block-explorers", "page_title": "Block Explorers", "index": 0, "depth": 2, "title": "Introduction", "anchor": "introduction", "start_char": 189, "end_char": 497, "estimated_token_count": 49, "token_estimator": "heuristic-v1", "text": "## Introduction\n\nBlock explorers serve as comprehensive blockchain analytics platforms that provide access to on-chain data. These web applications function as search engines for blockchain networks, allowing users to query, visualize, and analyze blockchain data in real time through intuitive interfaces."} @@ -553,7 +553,7 @@ {"page_id": "develop-toolkit-api-libraries-subxt", "page_title": "Subxt Rust API", "index": 8, "depth": 3, "title": "Submit Transactions", "anchor": "submit-transactions", "start_char": 7470, "end_char": 8944, "estimated_token_count": 311, "token_estimator": "heuristic-v1", "text": "### Submit Transactions\n\nTo submit a transaction, you must construct an extrinsic, sign it with your private key, and send it to the blockchain. Replace `INSERT_DEST_ADDRESS` with the recipient's address, `INSERT_AMOUNT` with the amount to transfer, and `INSERT_SECRET_PHRASE` with the sender's mnemonic phrase:\n\n```rust\n // Define the recipient address and transfer amount.\n const DEST_ADDRESS: &str = \"INSERT_DEST_ADDRESS\";\n const AMOUNT: u128 = INSERT_AMOUNT;\n\n // Convert the recipient address into an `AccountId32`.\n let dest = AccountId32::from_str(DEST_ADDRESS).unwrap();\n\n // Build the balance transfer extrinsic.\n let balance_transfer_tx = polkadot::tx()\n .balances()\n .transfer_allow_death(dest.into(), AMOUNT);\n\n // Load the sender's keypair from a mnemonic phrase.\n const SECRET_PHRASE: &str = \"INSERT_SECRET_PHRASE\";\n let mnemonic = Mnemonic::parse(SECRET_PHRASE).unwrap();\n let sender_keypair = Keypair::from_phrase(&mnemonic, None).unwrap();\n\n // Sign and submit the extrinsic, then wait for it to be finalized.\n let events = api\n .tx()\n .sign_and_submit_then_watch_default(&balance_transfer_tx, &sender_keypair)\n .await?\n .wait_for_finalized_success()\n .await?;\n\n // Check for a successful transfer event.\n if let Some(event) = events.find_first::()? 
{\n println!(\"Balance transfer successful: {:?}\", event);\n }\n```"} {"page_id": "develop-toolkit-api-libraries-subxt", "page_title": "Subxt Rust API", "index": 9, "depth": 2, "title": "Where to Go Next", "anchor": "where-to-go-next", "start_char": 8944, "end_char": 9174, "estimated_token_count": 57, "token_estimator": "heuristic-v1", "text": "## Where to Go Next\n\nNow that you've covered the basics dive into the official [subxt documentation](https://docs.rs/subxt/latest/subxt/book/index.html){target=\\_blank} for comprehensive reference materials and advanced features."} {"page_id": "develop-toolkit-api-libraries", "page_title": "API Libraries", "index": 0, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 392, "end_char": 442, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} -{"page_id": "develop-toolkit-api-libraries", "page_title": "API Libraries", "index": 1, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 442, "end_char": 1096, "estimated_token_count": 179, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} +{"page_id": "develop-toolkit-api-libraries", "page_title": "API Libraries", "index": 1, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 442, "end_char": 1074, "estimated_token_count": 173, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} {"page_id": "develop-toolkit-integrations-indexers", "page_title": "Indexers", "index": 0, "depth": 2, "title": "The Challenge of Blockchain Data Access", "anchor": "the-challenge-of-blockchain-data-access", "start_char": 12, "end_char": 649, "estimated_token_count": 103, "token_estimator": "heuristic-v1", "text": "## The Challenge of Blockchain Data Access\n\nBlockchain data is inherently sequential and distributed, with information stored chronologically across numerous blocks. While retrieving data from a single block through JSON-RPC API calls is straightforward, more complex queries that span multiple blocks present significant challenges:\n\n- Data is scattered and unorganized across the blockchain.\n- Retrieving large datasets can take days or weeks to sync.\n- Complex operations (like aggregations, averages, or cross-chain queries) require additional processing.\n- Direct blockchain queries can impact dApp performance and responsiveness."} {"page_id": "develop-toolkit-integrations-indexers", "page_title": "Indexers", "index": 1, "depth": 2, "title": "What is a Blockchain Indexer?", "anchor": "what-is-a-blockchain-indexer", "start_char": 649, "end_char": 1211, "estimated_token_count": 108, "token_estimator": "heuristic-v1", "text": "## What is a Blockchain Indexer?\n\nA blockchain indexer is a specialized infrastructure tool that processes, organizes, and stores blockchain data in an optimized format for efficient querying. 
Think of it as a search engine for blockchain data that:\n\n- Continuously monitors the blockchain for new blocks and transactions.\n- Processes and categorizes this data according to predefined schemas.\n- Stores the processed data in an easily queryable database.\n- Provides efficient APIs (typically [GraphQL](https://graphql.org/){target=\\_blank}) for data retrieval."} {"page_id": "develop-toolkit-integrations-indexers", "page_title": "Indexers", "index": 2, "depth": 2, "title": "Indexer Implementations", "anchor": "indexer-implementations", "start_char": 1211, "end_char": 2230, "estimated_token_count": 217, "token_estimator": "heuristic-v1", "text": "## Indexer Implementations\n\n
\n\n- __Subsquid__\n\n ---\n\n Subsquid is a data network that allows rapid and cost-efficient retrieval of blockchain data from 100+ chains using Subsquid's decentralized data lake and open-source SDK. In simple terms, Subsquid can be considered an ETL (extract, transform, and load) tool with a GraphQL server included. It enables comprehensive filtering, pagination, and even full-text search capabilities. Subsquid has native and full support for EVM and Substrate data, even within the same project.\n\n [:octicons-arrow-right-24: Reference](https://www.sqd.ai/){target=\\_blank}\n\n- __Subquery__\n\n ---\n\n SubQuery is a fast, flexible, and reliable open-source, decentralized data infrastructure network that provides both RPC and indexed data to consumers worldwide.\n It provides custom APIs for your web3 project across multiple supported chains.\n\n [:octicons-arrow-right-24: Reference](https://subquery.network/){target=\\_blank}\n\n
"} @@ -623,10 +623,10 @@ {"page_id": "develop-toolkit-parachains-fork-chains-chopsticks-get-started", "page_title": "Get Started", "index": 9, "depth": 2, "title": "Where to Go Next", "anchor": "where-to-go-next", "start_char": 10537, "end_char": 10894, "estimated_token_count": 91, "token_estimator": "heuristic-v1", "text": "## Where to Go Next\n\n
\n\n- Tutorial __Fork a Chain with Chopsticks__\n\n ---\n\n Visit this guide for step-by-step instructions for configuring and interacting with your forked chain.\n\n [:octicons-arrow-right-24: Reference](/tutorials/polkadot-sdk/testing/fork-live-chains/)\n\n
"} {"page_id": "develop-toolkit-parachains-fork-chains-chopsticks", "page_title": "Chopsticks", "index": 0, "depth": 2, "title": "What Can I Do with Chopsticks?", "anchor": "what-can-i-do-with-chopsticks", "start_char": 299, "end_char": 685, "estimated_token_count": 71, "token_estimator": "heuristic-v1", "text": "## What Can I Do with Chopsticks?\n\n- Create local forks of live networks.\n- Replay blocks to analyze behavior.\n- Test XCM interactions.\n- Simulate complex scenarios.\n- Modify network storage and state.\n\nWhether you're debugging an issue, testing new features, or exploring cross-chain interactions, Chopsticks provides a safe environment for blockchain experimentation and validation."} {"page_id": "develop-toolkit-parachains-fork-chains-chopsticks", "page_title": "Chopsticks", "index": 1, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 685, "end_char": 735, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} -{"page_id": "develop-toolkit-parachains-fork-chains-chopsticks", "page_title": "Chopsticks", "index": 2, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 735, "end_char": 1495, "estimated_token_count": 208, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} +{"page_id": "develop-toolkit-parachains-fork-chains-chopsticks", "page_title": "Chopsticks", "index": 2, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 735, "end_char": 1473, "estimated_token_count": 202, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} {"page_id": "develop-toolkit-parachains-fork-chains", "page_title": "Fork Chains for Testing", "index": 0, "depth": 2, "title": "Why Fork a Live Chain?", "anchor": "why-fork-a-live-chain", "start_char": 512, "end_char": 804, "estimated_token_count": 51, "token_estimator": "heuristic-v1", "text": "## Why Fork a Live Chain?\n\nForking a live chain creates a controlled environment that mirrors live network conditions. 
This approach enables you to:\n\n- Test features safely before deployment.\n- Debug complex interactions.\n- Validate runtime changes.\n- Experiment with network modifications."} {"page_id": "develop-toolkit-parachains-fork-chains", "page_title": "Fork Chains for Testing", "index": 1, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 804, "end_char": 854, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} -{"page_id": "develop-toolkit-parachains-fork-chains", "page_title": "Fork Chains for Testing", "index": 2, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 854, "end_char": 1295, "estimated_token_count": 120, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} +{"page_id": "develop-toolkit-parachains-fork-chains", "page_title": "Fork Chains for Testing", "index": 2, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 854, "end_char": 1284, "estimated_token_count": 117, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} {"page_id": "develop-toolkit-parachains-light-clients", "page_title": "Light Clients", "index": 0, "depth": 2, "title": "Introduction", "anchor": "introduction", "start_char": 17, "end_char": 994, "estimated_token_count": 167, "token_estimator": "heuristic-v1", "text": "## Introduction\n\nLight clients enable secure and efficient blockchain interaction without running a full node. They provide a trust-minimized alternative to JSON-RPC by verifying data through cryptographic proofs rather than blindly trusting remote nodes.\n\nThis guide covers:\n\n- What light clients are and how they work.\n- Their advantages compared to full nodes and JSON-RPC.\n- Available implementations in the Polkadot ecosystem.\n- How to use light clients in your applications.\n\nLight clients are particularly valuable for resource-constrained environments and applications requiring secure, decentralized blockchain access without the overhead of maintaining full nodes.\n\n!!!note \"Light node or light client?\"\n The terms _light node_ and _light client_ are interchangeable. Both refer to a blockchain client that syncs without downloading the entire blockchain state. All nodes in a blockchain network are fundamentally clients, engaging in peer-to-peer communication."} {"page_id": "develop-toolkit-parachains-light-clients", "page_title": "Light Clients", "index": 1, "depth": 2, "title": "Light Clients Workflow", "anchor": "light-clients-workflow", "start_char": 994, "end_char": 2625, "estimated_token_count": 359, "token_estimator": "heuristic-v1", "text": "## Light Clients Workflow\n\nUnlike JSON-RPC interfaces, where an application must maintain a list of providers or rely on a single node, light clients are not limited to or dependent on a single node. They use cryptographic proofs to verify the blockchain's state, ensuring it is up-to-date and accurate. By verifying only block headers, light clients avoid syncing the entire state, making them ideal for resource-constrained environments.\n\n```mermaid\nflowchart LR\nDAPP([dApp])-- Query Account Info -->LC([Light Client])\nLC -- Request --> FN(((Full Node)))\nLC -- Response --> DAPP\nFN -- Response (validated via Merkle proof) --> LC\n```\n\nIn the diagram above, the decentralized application queries on-chain account information through the light client. 
The light client runs as part of the application and requires minimal memory and computational resources. It uses Merkle proofs to verify the state retrieved from a full node in a trust-minimized manner. Polkadot-compatible light clients utilize [warp syncing](https://spec.polkadot.network/sect-lightclient#sect-sync-warp-lightclient){target=\\_blank}, which downloads only block headers.\n\nLight clients can quickly verify the blockchain's state, including [GRANDPA finality](/polkadot-protocol/glossary#grandpa){target=\\_blank} justifications.\n\n!!!note \"What does it mean to be trust-minimized?\"\n _Trust-minimized_ means that the light client does not need to fully trust the full node from which it retrieves the state. This is achieved through the use of Merkle proofs, which allow the light client to verify the correctness of the state by checking the Merkle tree root."} {"page_id": "develop-toolkit-parachains-light-clients", "page_title": "Light Clients", "index": 2, "depth": 2, "title": "JSON-RPC and Light Client Comparison", "anchor": "json-rpc-and-light-client-comparison", "start_char": 2625, "end_char": 4478, "estimated_token_count": 442, "token_estimator": "heuristic-v1", "text": "## JSON-RPC and Light Client Comparison\n\nAnother common method of communication between a user interface (UI) and a node is through the JSON-RPC protocol. Generally, the UI retrieves information from the node, fetches network or [pallet](/polkadot-protocol/glossary#pallet){target=\\_blank} data, and interacts with the blockchain. This is typically done in one of two ways:\n\n- **User-controlled nodes**: The UI connects to a node client installed on the user's machine.\n - These nodes are secure, but installation and maintenance can be inconvenient.\n- **Publicly accessible nodes**: The UI connects to a third-party-owned publicly accessible node client.\n - These nodes are convenient but centralized and less secure. Applications must maintain a list of backup nodes in case the primary node becomes unavailable.\n\nWhile light clients still communicate with [full nodes](/polkadot-protocol/glossary#full-node), they offer significant advantages for applications requiring a secure alternative to running a full node:\n\n| Full Node | Light Client |\n| :---------------------------------------------------------------------------------------------: | :------------------------------------------------------------: |\n| Fully verifies all blocks of the chain | Verifies only the authenticity of blocks |\n| Stores previous block data and the chain's storage in a database | Does not require a database |\n| Installation, maintenance, and execution are resource-intensive and require technical expertise | No installation is typically included as part of the application |"} @@ -694,10 +694,10 @@ {"page_id": "develop-toolkit-parachains-spawn-chains-zombienet-write-tests", "page_title": "Write Tests", "index": 7, "depth": 2, "title": "Example Test Files", "anchor": "example-test-files", "start_char": 9313, "end_char": 11297, "estimated_token_count": 504, "token_estimator": "heuristic-v1", "text": "## Example Test Files\n\nThe following example test files define two tests, a small network test and a big network test. Each test defines a network configuration file and credentials to use.\n\nThe tests define assertions to evaluate the network’s metrics and logs. 
The assertions are defined by sentences in the DSL, which are mapped to tests to run.\n\n```toml title=\"small-network-test.zndsl\"\nDescription = \"Small Network test\"\nNetwork = \"./0000-test-config-small-network.toml\"\nCreds = \"config\"\n\n# Metrics\n[[metrics]]\nnode_roles = 4\nsub_libp2p_is_major_syncing = 0\n\n# Logs\n[[logs]]\nbob_log_line_glob = \"*rted #1*\"\nbob_log_line_regex = \"Imported #[0-9]+\"\n\n```\n\nAnd the second test file:\n\n```toml title=\"big-network-test.zndsl\"\nDescription = \"Big Network test\"\nNetwork = \"./0001-test-config-big-network.toml\"\nCreds = \"config\"\n\n# Metrics\n[[metrics]]\nnode_roles = 4\nsub_libp2p_is_major_syncing = 0\n\n# Logs\n[[logs]]\nbob_log_line_glob = \"*rted #1*\"\nbob_log_line_regex = \"Imported #[0-9]+\"\n\n# Custom JS script\n[[custom_scripts]]\nalice_js_script = { path = \"./0008-custom.js\", condition = \"return is greater than 1\", timeout = 200 }\n\n# Custom TS script\n[[custom_scripts]]\nalice_ts_script = { path = \"./0008-custom-ts.ts\", condition = \"return is greater than 1\", timeout = 200 }\n\n# Backchannel\n[[backchannel]]\nalice_wait_for_name = { use_as = \"X\", timeout = 30 }\n\n# Well-known functions\n[[functions]]\nalice_is_up = true\nalice_parachain_100_registered = { condition = \"within\", timeout = 225 }\nalice_parachain_100_block_height = { condition = \"at least 10\", timeout = 250 }\n\n# Histogram\n[[histogram]]\nalice_polkadot_pvf_execution_time = { min_samples = 2, buckets = [\n \"0.1\",\n \"0.25\",\n \"0.5\",\n \"+Inf\",\n], timeout = 100 }\n\n# System events\n[[system_events]]\nalice_system_event_matches = { pattern = \"\\\"paraId\\\":[0-9]+\", timeout = 10 }\n\n# Tracing\n[[tracing]]\nalice_trace = { traceID = \"94c1501a78a0d83c498cc92deec264d9\", contains = [\n \"answer-chunk-request\",\n \"answer-chunk-request\",\n] }\n\n```"} {"page_id": "develop-toolkit-parachains-spawn-chains-zombienet", "page_title": "Zombienet", "index": 0, "depth": 2, "title": "What Can I Do with Zombienet?", "anchor": "what-can-i-do-with-zombienet", "start_char": 350, "end_char": 729, "estimated_token_count": 66, "token_estimator": "heuristic-v1", "text": "## What Can I Do with Zombienet?\n\n- Deploy test networks with multiple nodes.\n- Validate network behavior and performance.\n- Monitor metrics and system events.\n- Execute custom test scenarios.\n\nWhether you're building a new parachain or testing runtime upgrades, Zombienet provides the tools needed to ensure your blockchain functions correctly before deployment to production."} {"page_id": "develop-toolkit-parachains-spawn-chains-zombienet", "page_title": "Zombienet", "index": 1, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 729, "end_char": 779, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} -{"page_id": "develop-toolkit-parachains-spawn-chains-zombienet", "page_title": "Zombienet", "index": 2, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 779, "end_char": 1237, "estimated_token_count": 115, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n
\n "} +{"page_id": "develop-toolkit-parachains-spawn-chains-zombienet", "page_title": "Zombienet", "index": 2, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 779, "end_char": 1226, "estimated_token_count": 112, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n
\n "} {"page_id": "develop-toolkit-parachains-spawn-chains", "page_title": "Spawn Networks for Testing", "index": 0, "depth": 2, "title": "Why Spawn a Network?", "anchor": "why-spawn-a-network", "start_char": 448, "end_char": 727, "estimated_token_count": 51, "token_estimator": "heuristic-v1", "text": "## Why Spawn a Network?\n\nSpawning a network provides a controlled environment to test and validate various aspects of your blockchain. Use these tools to:\n\n- Validate network configurations.\n- Test cross-chain messaging.\n- Verify runtime upgrades.\n- Debug complex interactions."} {"page_id": "develop-toolkit-parachains-spawn-chains", "page_title": "Spawn Networks for Testing", "index": 1, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 727, "end_char": 777, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} -{"page_id": "develop-toolkit-parachains-spawn-chains", "page_title": "Spawn Networks for Testing", "index": 2, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 777, "end_char": 1199, "estimated_token_count": 108, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n
\n "} +{"page_id": "develop-toolkit-parachains-spawn-chains", "page_title": "Spawn Networks for Testing", "index": 2, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 777, "end_char": 1188, "estimated_token_count": 105, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n
\n "} {"page_id": "develop-toolkit-parachains", "page_title": "Parachains", "index": 0, "depth": 2, "title": "Quick Links", "anchor": "quick-links", "start_char": 600, "end_char": 1005, "estimated_token_count": 110, "token_estimator": "heuristic-v1", "text": "## Quick Links\n\n- [Use Pop CLI to start your parachain project](/develop/toolkit/parachains/quickstart/pop-cli/)\n- [Use Zombienet to spawn a chain](/develop/toolkit/parachains/spawn-chains/zombienet/get-started/)\n- [Use Chopsticks to fork a chain](/develop/toolkit/parachains/fork-chains/chopsticks/get-started/)\n- [Use Moonwall to execute E2E testing](/develop/toolkit/parachains/e2e-testing/moonwall/)"} {"page_id": "develop-toolkit-parachains", "page_title": "Parachains", "index": 1, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 1005, "end_char": 1054, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} {"page_id": "develop-toolkit", "page_title": "Toolkit", "index": 0, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 853, "end_char": 902, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} @@ -794,7 +794,7 @@ {"page_id": "infrastructure-running-a-validator-onboarding-and-offboarding-stop-validating", "page_title": "Stop Validating", "index": 3, "depth": 2, "title": "Purge Validator Session Keys", "anchor": "purge-validator-session-keys", "start_char": 1499, "end_char": 2530, "estimated_token_count": 194, "token_estimator": "heuristic-v1", "text": "## Purge Validator Session Keys\n\nPurging validator session keys is a critical step in removing the association between your validator account and its session keys, which ensures that your account is fully disassociated from validator activities. The `session.purgeKeys` extrinsic removes the reference to your session keys from the stash or staking proxy account that originally set them.\n\nHere are a couple of important things to know about purging keys:\n\n- **Account used to purge keys**: Always use the same account to purge keys you originally used to set them, usually your stash or staking proxy account. Using a different account may leave an unremovable reference to the session keys on the original account, preventing its reaping.\n- **Account reaping issue**: Failing to purge keys will prevent you from reaping (fully deleting) your stash account. If you attempt to transfer tokens without purging, you'll need to rebond, purge the session keys, unbond again, and wait through the unbonding period before any transfer."} {"page_id": "infrastructure-running-a-validator-onboarding-and-offboarding-stop-validating", "page_title": "Stop Validating", "index": 4, "depth": 2, "title": "Unbond Your Tokens", "anchor": "unbond-your-tokens", "start_char": 2530, "end_char": 3228, "estimated_token_count": 142, "token_estimator": "heuristic-v1", "text": "## Unbond Your Tokens\n\nAfter chilling your node and purging session keys, the final step is to unbond your staked tokens. This action removes them from staking and begins the unbonding period (usually 28 days for Polkadot and seven days for Kusama), after which the tokens will be transferable.\n\nTo unbond tokens, go to **Network > Staking > Account Actions** on Polkadot.js Apps. Select your stash account, click on the dropdown menu, and choose **Unbond Funds**. 
Alternatively, you can use the `staking.unbond` extrinsic if you handle this via a staking proxy account.\n\nOnce the unbonding period is complete, your tokens will be available for use in transactions or transfers outside of staking."} {"page_id": "infrastructure-running-a-validator-onboarding-and-offboarding", "page_title": "Onboarding and Offboarding", "index": 0, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 381, "end_char": 431, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} -{"page_id": "infrastructure-running-a-validator-onboarding-and-offboarding", "page_title": "Onboarding and Offboarding", "index": 1, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 431, "end_char": 1975, "estimated_token_count": 404, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} +{"page_id": "infrastructure-running-a-validator-onboarding-and-offboarding", "page_title": "Onboarding and Offboarding", "index": 1, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 431, "end_char": 1931, "estimated_token_count": 392, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} {"page_id": "infrastructure-running-a-validator-operational-tasks-general-management", "page_title": "General Management", "index": 0, "depth": 2, "title": "Introduction", "anchor": "introduction", "start_char": 22, "end_char": 759, "estimated_token_count": 119, "token_estimator": "heuristic-v1", "text": "## Introduction\n\nValidator performance is pivotal in maintaining the security and stability of the Polkadot network. As a validator, optimizing your setup ensures efficient transaction processing, minimizes latency, and maintains system reliability during high-demand periods. Proper configuration and proactive monitoring also help mitigate risks like slashing and service interruptions.\n\nThis guide covers essential practices for managing a validator, including performance tuning techniques, security hardening, and tools for real-time monitoring. Whether you're fine-tuning CPU settings, configuring NUMA balancing, or setting up a robust alert system, these steps will help you build a resilient and efficient validator operation."} {"page_id": "infrastructure-running-a-validator-operational-tasks-general-management", "page_title": "General Management", "index": 1, "depth": 2, "title": "Configuration Optimization", "anchor": "configuration-optimization", "start_char": 759, "end_char": 987, "estimated_token_count": 35, "token_estimator": "heuristic-v1", "text": "## Configuration Optimization\n\nFor those seeking to optimize their validator's performance, the following configurations can improve responsiveness, reduce latency, and ensure consistent performance during high-demand periods."} {"page_id": "infrastructure-running-a-validator-operational-tasks-general-management", "page_title": "General Management", "index": 2, "depth": 3, "title": "Deactivate Simultaneous Multithreading", "anchor": "deactivate-simultaneous-multithreading", "start_char": 987, "end_char": 2478, "estimated_token_count": 333, "token_estimator": "heuristic-v1", "text": "### Deactivate Simultaneous Multithreading\n\nPolkadot validators operate primarily in single-threaded mode for critical tasks, so optimizing single-core CPU performance can reduce latency and improve stability. 
Deactivating simultaneous multithreading (SMT) can prevent virtual cores from affecting performance. SMT is called Hyper-Threading on Intel and 2-way SMT on AMD Zen.\n\nTake the following steps to deactivate every other (vCPU) core:\n\n1. Loop though all the CPU cores and deactivate the virtual cores associated with them:\n\n ```bash\n for cpunum in $(cat /sys/devices/system/cpu/cpu*/topology/thread_siblings_list | \\\n cut -s -d, -f2- | tr ',' '\\n' | sort -un)\n do\n echo 0 > /sys/devices/system/cpu/cpu$cpunum/online\n done\n ```\n\n2. To permanently save the changes, add `nosmt=force` to the `GRUB_CMDLINE_LINUX_DEFAULT` variable in `/etc/default/grub`:\n\n ```bash\n sudo nano /etc/default/grub\n # Add to GRUB_CMDLINE_LINUX_DEFAULT\n ```\n\n ```config title=\"/etc/default/grub\"\n GRUB_DEFAULT = 0;\n GRUB_HIDDEN_TIMEOUT = 0;\n GRUB_HIDDEN_TIMEOUT_QUIET = true;\n GRUB_TIMEOUT = 10;\n GRUB_DISTRIBUTOR = `lsb_release -i -s 2> /dev/null || echo Debian`;\n GRUB_CMDLINE_LINUX_DEFAULT = 'nosmt=force';\n GRUB_CMDLINE_LINUX = '';\n ```\n\n3. Update GRUB to apply changes:\n\n ```bash\n sudo update-grub\n ```\n\n4. After the reboot, you should see that half of the cores are offline. To confirm, run:\n\n ```bash\n lscpu --extended\n ```"} @@ -827,14 +827,14 @@ {"page_id": "infrastructure-running-a-validator-operational-tasks-upgrade-your-node", "page_title": "Upgrade a Validator Node", "index": 5, "depth": 3, "title": "Session `N`", "anchor": "session-n", "start_char": 3111, "end_char": 4063, "estimated_token_count": 216, "token_estimator": "heuristic-v1", "text": "### Session `N`\n\n1. **Start Validator B**: Launch a secondary node and wait until it is fully synced with the network. Once synced, start it with the `--validator` flag. This node will now act as Validator B.\n2. **Generate session keys**: Create new session keys specifically for Validator B.\n3. **Submit the `set_key` extrinsic**: Use your staking proxy account to submit a `set_key` extrinsic, linking the session keys for Validator B to your staking setup.\n4. **Record the session**: Make a note of the session in which you executed this extrinsic.\n5. **Wait for session changes**: Allow the current session to end and then wait for two additional full sessions for the new keys to take effect.\n\n!!! warning \"Keep Validator A running\"\n\n It is crucial to keep Validator A operational during this entire waiting period. Since `set_key` does not take effect immediately, turning off Validator A too early may result in chilling or even slashing."} {"page_id": "infrastructure-running-a-validator-operational-tasks-upgrade-your-node", "page_title": "Upgrade a Validator Node", "index": 6, "depth": 3, "title": "Session `N+3`", "anchor": "session-n3", "start_char": 4063, "end_char": 5624, "estimated_token_count": 378, "token_estimator": "heuristic-v1", "text": "### Session `N+3`\n\nAt this stage, Validator B becomes your active validator. You can now safely perform any maintenance tasks on Validator A.\n\nComplete the following steps when you are ready to bring Validator A back online:\n\n1. **Start Validator A**: Launch Validator A, sync the blockchain database, and ensure it is running with the `--validator` flag.\n2. **Generate new session keys for Validator A**: Create fresh session keys for Validator A.\n3. **Submit the `set_key` extrinsic**: Using your staking proxy account, submit a `set_key` extrinsic with the new Validator A session keys.\n4. 
**Record the session**: Again, make a note of the session in which you executed this extrinsic.\n\nKeep Validator B active until the session during which you executed the `set-key` extrinsic completes plus two additional full sessions have passed. Once Validator A has successfully taken over, you can safely stop Validator B. This process helps ensure a smooth handoff between nodes and minimizes the risk of downtime or penalties. Verify the transition by checking for finalized blocks in the new session. The logs should indicate the successful change, similar to the example below:\n\n
\n INSERT_COMMAND\n 2019-10-28 21:44:13 Applying authority set change scheduled at block #450092\n 2019-10-28 21:44:13 Applying GRANDPA set change to new set with 20 authorities\n \n
"} {"page_id": "infrastructure-running-a-validator-operational-tasks", "page_title": "Operational Tasks", "index": 0, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 593, "end_char": 643, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} -{"page_id": "infrastructure-running-a-validator-operational-tasks", "page_title": "Operational Tasks", "index": 1, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 643, "end_char": 1520, "estimated_token_count": 224, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} +{"page_id": "infrastructure-running-a-validator-operational-tasks", "page_title": "Operational Tasks", "index": 1, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 643, "end_char": 1498, "estimated_token_count": 218, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} {"page_id": "infrastructure-running-a-validator-requirements", "page_title": "Validator Requirements", "index": 0, "depth": 2, "title": "Introduction", "anchor": "introduction", "start_char": 26, "end_char": 981, "estimated_token_count": 159, "token_estimator": "heuristic-v1", "text": "## Introduction\n\nRunning a validator in the Polkadot ecosystem is essential for maintaining network security and decentralization. Validators are responsible for validating transactions and adding new blocks to the chain, ensuring the system operates smoothly. In return for their services, validators earn rewards. However, the role comes with inherent risks, such as slashing penalties for misbehavior or technical failures. If you’re new to validation, starting on Kusama provides a lower-stakes environment to gain valuable experience before progressing to the Polkadot network.\n\nThis guide covers everything you need to know about becoming a validator, including system requirements, staking prerequisites, and infrastructure setup. Whether you’re deploying on a VPS or running your node on custom hardware, you’ll learn how to optimize your validator for performance and security, ensuring compliance with network standards while minimizing risks."} {"page_id": "infrastructure-running-a-validator-requirements", "page_title": "Validator Requirements", "index": 1, "depth": 2, "title": "Prerequisites", "anchor": "prerequisites", "start_char": 981, "end_char": 2390, "estimated_token_count": 296, "token_estimator": "heuristic-v1", "text": "## Prerequisites\n\nRunning a validator requires solid system administration skills and a secure, well-maintained infrastructure. Below are the primary requirements you need to be aware of before getting started:\n\n- **System administration expertise**: Handling technical anomalies and maintaining node infrastructure is critical. Validators must be able to troubleshoot and optimize their setup.\n- **Security**: Ensure your setup follows best practices for securing your node. Refer to the [Secure Your Validator](/infrastructure/running-a-validator/operational-tasks/general-management/#secure-your-validator){target=\\_blank} section to learn about important security measures.\n- **Network choice**: Start with [Kusama](/infrastructure/running-a-validator/onboarding-and-offboarding/set-up-validator/#run-a-kusama-validator){target=\\_blank} to gain experience. 
Look for \"Adjustments for Kusama\" throughout these guides for tips on adapting the provided instructions for the Kusama network.\n- **Staking requirements**: A minimum amount of native token (KSM or DOT) is required to be elected into the validator set. The required stake can come from your own holdings or from nominators.\n- **Risk of slashing**: Any DOT you stake is at risk if your setup fails or your validator misbehaves. If you’re unsure of your ability to maintain a reliable validator, consider nominating your DOT to a trusted validator."} {"page_id": "infrastructure-running-a-validator-requirements", "page_title": "Validator Requirements", "index": 2, "depth": 2, "title": "Minimum Hardware Requirements", "anchor": "minimum-hardware-requirements", "start_char": 2390, "end_char": 3554, "estimated_token_count": 251, "token_estimator": "heuristic-v1", "text": "## Minimum Hardware Requirements\n\nPolkadot validators rely on high-performance hardware to process blocks efficiently. The recommended minimum hardware requirements to ensure a fully functional and performant validator are as follows:\n\n- CPU:\n\n - x86-64 compatible.\n - Eight physical cores @ 3.4 GHz.\n - Processor:\n - **Intel**: Ice Lake or newer (Xeon or Core series)\n - **AMD**: Zen3 or newer (EPYC or Ryzen)\n - Simultaneous multithreading disabled:\n - **Intel**: Hyper-Threading\n - **AMD**: SMT\n - [Single-threaded performance](https://www.cpubenchmark.net/singleThread.html){target=\\_blank} is prioritized over higher cores count.\n\n- Storage:\n\n - **NVMe SSD**: At least 2 TB for blockchain data recommended (prioritize latency rather than throughput).\n - Storage requirements will increase as the chain grows. For current estimates, see the [current chain snapshot](https://stakeworld.io/docs/dbsize){target=\\_blank}.\n\n- Memory:\n\n - 32 GB DDR4 ECC\n\n- Network:\n\n - Symmetric networking speed of 500 Mbit/s is required to handle large numbers of parachains and ensure congestion control during peak times."} {"page_id": "infrastructure-running-a-validator-requirements", "page_title": "Validator Requirements", "index": 3, "depth": 2, "title": "VPS Provider List", "anchor": "vps-provider-list", "start_char": 3554, "end_char": 6073, "estimated_token_count": 575, "token_estimator": "heuristic-v1", "text": "## VPS Provider List\n\nWhen selecting a VPS provider for your validator node, prioritize reliability, consistent performance, and adherence to the specific hardware requirements set for Polkadot validators. The following server types have been tested and showed acceptable performance in benchmark tests. However, this is not an endorsement and actual performance may vary depending on your workload and VPS provider.\n\nBe aware that some providers may overprovision the underlying host and use shared storage such as NVMe over TCP, which appears as local storage. These setups might result in poor or inconsistent performance. 
Benchmark your infrastructure before deploying.\n\n- **[Google Cloud Platform (GCP)](https://cloud.google.com/){target=\\_blank}**: `c2` and `c2d` machine families offer high-performance configurations suitable for validators.\n- **[Amazon Web Services (AWS)](https://aws.amazon.com/){target=\\_blank}**: `c6id` machine family provides strong performance, particularly for I/O-intensive workloads.\n- **[OVH](https://www.ovhcloud.com/en-au/){target=\\_blank}**: Can be a budget-friendly solution if it meets your minimum hardware specifications.\n- **[Digital Ocean](https://www.digitalocean.com/){target=\\_blank}**: Popular among developers, Digital Ocean's premium droplets offer configurations suitable for medium to high-intensity workloads.\n- **[Vultr](https://www.vultr.com/){target=\\_blank}**: Offers flexibility with plans that may meet validator requirements, especially for high-bandwidth needs.\n- **[Linode](https://www.linode.com/){target=\\_blank}**: Provides detailed documentation, which can be helpful for setup.\n- **[Scaleway](https://www.scaleway.com/en/){target=\\_blank}**: Offers high-performance cloud instances that can be suitable for validator nodes.\n- **[OnFinality](https://onfinality.io/en){target=\\_blank}**: Specialized in blockchain infrastructure, OnFinality provides validator-specific support and configurations.\n\n!!! warning \"Acceptable use policies\"\n Different VPS providers have varying acceptable use policies, and not all allow cryptocurrency-related activities. \n\n For example, Digital Ocean, requires explicit permission to use servers for cryptocurrency mining and defines unauthorized mining as [network abuse](https://www.digitalocean.com/legal/acceptable-use-policy#network-abuse){target=\\_blank} in their acceptable use policy. \n \n Review the terms for your VPS provider to avoid account suspension or server shutdown due to policy violations."} {"page_id": "infrastructure-running-a-validator-requirements", "page_title": "Validator Requirements", "index": 4, "depth": 2, "title": "Minimum Bond Requirement", "anchor": "minimum-bond-requirement", "start_char": 6073, "end_char": 6838, "estimated_token_count": 196, "token_estimator": "heuristic-v1", "text": "## Minimum Bond Requirement\n\nBefore bonding DOT, ensure you meet the minimum bond requirement to start a validator instance. The minimum bond is the least DOT you need to stake to enter the validator set. To become eligible for rewards, your validator node must be nominated by enough staked tokens.\n\nFor example, on November 19, 2024, the minimum stake backing a validator in Polkadot's era 1632 was 1,159,434.248 DOT. 
You can check the current minimum stake required using these tools:\n\n- [**Chain State Values**](https://wiki.polkadot.com/general/chain-state-values/){target=\\_blank}\n- [**Subscan**](https://polkadot.subscan.io/validator_list?status=validator){target=\\_blank}\n- [**Staking Dashboard**](https://staking.polkadot.cloud/#/overview){target=\\_blank}"} {"page_id": "infrastructure-running-a-validator", "page_title": "Running a Validator", "index": 0, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 412, "end_char": 462, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} -{"page_id": "infrastructure-running-a-validator", "page_title": "Running a Validator", "index": 1, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 462, "end_char": 1603, "estimated_token_count": 307, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} +{"page_id": "infrastructure-running-a-validator", "page_title": "Running a Validator", "index": 1, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 462, "end_char": 1570, "estimated_token_count": 298, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} {"page_id": "infrastructure-staking-mechanics-offenses-and-slashes", "page_title": "Offenses and Slashes", "index": 0, "depth": 2, "title": "Introduction", "anchor": "introduction", "start_char": 24, "end_char": 674, "estimated_token_count": 104, "token_estimator": "heuristic-v1", "text": "## Introduction\n\nIn Polkadot's Nominated Proof of Stake (NPoS) system, validator misconduct is deterred through a combination of slashing, disabling, and reputation penalties. Validators and nominators who stake tokens face consequences for validator misbehavior, which range from token slashes to restrictions on network participation.\n\nThis page outlines the types of offenses recognized by Polkadot, including block equivocations and invalid votes, as well as the corresponding penalties. While some parachains may implement additional custom slashing mechanisms, this guide focuses on the offenses tied to staking within the Polkadot ecosystem."} {"page_id": "infrastructure-staking-mechanics-offenses-and-slashes", "page_title": "Offenses and Slashes", "index": 1, "depth": 2, "title": "Offenses", "anchor": "offenses", "start_char": 674, "end_char": 1106, "estimated_token_count": 86, "token_estimator": "heuristic-v1", "text": "## Offenses\n\nPolkadot is a public permissionless network. As such, it has a mechanism to disincentivize offenses and incentivize good behavior. You can review the [parachain protocol](https://wiki.polkadot.com/learn/learn-parachains-protocol/#parachain-protocol){target=\\_blank} to understand better the terminology used to describe offenses. Polkadot validator offenses fall into two categories: invalid votes and equivocations."} {"page_id": "infrastructure-staking-mechanics-offenses-and-slashes", "page_title": "Offenses and Slashes", "index": 2, "depth": 3, "title": "Invalid Votes", "anchor": "invalid-votes", "start_char": 1106, "end_char": 1733, "estimated_token_count": 128, "token_estimator": "heuristic-v1", "text": "### Invalid Votes\n\nA validator will be penalized for inappropriate voting activity during the block inclusion and approval processes. 
The invalid voting related offenses are as follows:\n\n- **Backing an invalid block**: A para-validator backs an invalid block for inclusion in a fork of the relay chain.\n- **`ForInvalid` vote**: When acting as a secondary checker, the validator votes in favor of an invalid block.\n- **`AgainstValid` vote**: When acting as a secondary checker, the validator votes against a valid block. This type of vote wastes network resources required to resolve the disparate votes and resulting dispute."} @@ -851,7 +851,7 @@ {"page_id": "infrastructure-staking-mechanics-rewards-payout", "page_title": "Rewards Payout", "index": 4, "depth": 2, "title": "Running Multiple Validators", "anchor": "running-multiple-validators", "start_char": 5622, "end_char": 7233, "estimated_token_count": 423, "token_estimator": "heuristic-v1", "text": "## Running Multiple Validators\n\nRunning multiple validators can offer a more favorable risk/reward ratio compared to running a single one. If you have sufficient DOT or nominators staking on your validators, maintaining multiple validators within the active set can yield higher rewards.\n\nIn the preceding section, with 18 DOT staked and no nominators, Alice earned 2 DOT in one era. This example uses DOT, but the same principles apply for KSM on the Kusama network. By managing stake across multiple validators, you can potentially increase overall returns. Recall the set of validators from the preceding section:\n\n``` mermaid\nflowchart TD\n A[\"Alice (18 DOT)\"]\n B[\"Bob (9 DOT)\"]\n C[\"Carol (8 DOT)\"]\n D[\"Dave (7 DOT)\"]\n E[\"Payout (8 DOT total)\"]\n E --\"2 DOT\"--> A\n E --\"2 DOT\"--> B\n E --\"2 DOT\"--> C\n E --\"2 DOT\"--> D \n```\n\nNow, assume Alice decides to split their stake and run two validators, each with a nine DOT stake. This validator set only has four spots and priority is given to validators with a larger stake. In this example, Dave has the smallest stake and loses his spot in the validator set. Now, Alice will earn two shares of the total payout each era as illustrated below:\n\n``` mermaid\nflowchart TD\n A[\"Alice (9 DOT)\"]\n F[\"Alice (9 DOT)\"]\n B[\"Bob (9 DOT)\"]\n C[\"Carol (8 DOT)\"]\n E[\"Payout (8 DOT total)\"]\n E --\"2 DOT\"--> A\n E --\"2 DOT\"--> B\n E --\"2 DOT\"--> C\n E --\"2 DOT\"--> F \n```\n\nWith enough stake, you could run more than two validators. However, each validator must have enough stake behind it to maintain a spot in the validator set."} {"page_id": "infrastructure-staking-mechanics-rewards-payout", "page_title": "Rewards Payout", "index": 5, "depth": 2, "title": "Nominators and Validator Payments", "anchor": "nominators-and-validator-payments", "start_char": 7233, "end_char": 11070, "estimated_token_count": 990, "token_estimator": "heuristic-v1", "text": "## Nominators and Validator Payments\n\nA nominator's stake allows them to vote for validators and earn a share of the rewards without managing a validator node. Although staking rewards depend on validator activity during an era, validators themselves never control or own nominator rewards. To trigger payouts, anyone can call the `staking.payoutStakers` or `staking.payoutStakerByPage` methods, which mint and distribute rewards directly to the recipients. This trustless process ensures nominators receive their earned rewards.\n\nValidators set a commission rate as a percentage of the block reward, affecting how rewards are shared with nominators. 
A 0% commission means the validator keeps only rewards from their self-stake, while a 100% commission means they retain all rewards, leaving none for nominators.\n\nThe following examples model splitting validator payments between nominator and validator using various commission percentages. For simplicity, these examples assume a Polkadot-SDK based relay chain that uses DOT as a native token and a single nominator per validator. Calculations of KSM reward payouts for Kusama follow the same formula. \n\nStart with the original validator set from the previous section: \n\n``` mermaid\nflowchart TD\n A[\"Alice (18 DOT)\"]\n B[\"Bob (9 DOT)\"]\n C[\"Carol (8 DOT)\"]\n D[\"Dave (7 DOT)\"]\n E[\"Payout (8 DOT total)\"]\n E --\"2 DOT\"--> A\n E --\"2 DOT\"--> B\n E --\"2 DOT\"--> C\n E --\"2 DOT\"--> D \n```\n\nThe preceding diagram shows each validator receiving a 2 DOT payout, but doesn't account for sharing rewards with nominators. The following diagram shows what nominator payout might look like for validator Alice. Alice has a 20% commission rate and holds 50% of the stake for their validator:\n\n``` mermaid\n\nflowchart TD\n A[\"Gross Rewards = 2 DOT\"]\n E[\"Commission = 20%\"]\n F[\"Alice Validator Payment = 0.4 DOT\"]\n G[\"Total Stake Rewards = 1.6 DOT\"]\n B[\"Alice Validator Stake = 18 DOT\"]\n C[\"9 DOT Alice (50%)\"]\n H[\"Alice Stake Reward = 0.8 DOT\"]\n I[\"Total Alice Validator Reward = 1.2 DOT\"]\n D[\"9 DOT Nominator (50%)\"]\n J[\"Total Nominator Reward = 0.8 DOT\"]\n \n A --> E\n E --(2 x 0.20)--> F\n F --(2 - 0.4)--> G\n B --> C\n B --> D\n C --(1.6 x 0.50)--> H\n H --(0.4 + 0.8)--> I\n D --(1.60 x 0.50)--> J\n```\n\nNotice the validator commission rate is applied against the gross amount of rewards for the era. The validator commission is subtracted from the total rewards. After the commission is paid to the validator, the remaining amount is split among stake owners according to their percentage of the total stake. A validator's total rewards for an era include their commission plus their piece of the stake rewards. \n\nNow, consider a different scenario for validator Bob where the commission rate is 40%, and Bob holds 33% of the stake for their validator:\n\n``` mermaid\n\nflowchart TD\n A[\"Gross Rewards = 2 DOT\"]\n E[\"Commission = 40%\"]\n F[\"Bob Validator Payment = 0.8 DOT\"]\n G[\"Total Stake Rewards = 1.2 DOT\"]\n B[\"Bob Validator Stake = 9 DOT\"]\n C[\"3 DOT Bob (33%)\"]\n H[\"Bob Stake Reward = 0.4 DOT\"]\n I[\"Total Bob Validator Reward = 1.2 DOT\"]\n D[\"6 DOT Nominator (67%)\"]\n J[\"Total Nominator Reward = 0.8 DOT\"]\n \n A --> E\n E --(2 x 0.4)--> F\n F --(2 - 0.8)--> G\n B --> C\n B --> D\n C --(1.2 x 0.33)--> H\n H --(0.8 + 0.4)--> I\n D --(1.2 x 0.67)--> J\n```\n\nBob holds a smaller percentage of their node's total stake, making their stake reward smaller than Alice's. In this scenario, Bob makes up the difference by charging a 40% commission rate and ultimately ends up with the same total payment as Alice. 
Each validator will need to find their ideal balance between the amount of stake and commission rate to attract nominators while still making running a validator worthwhile."} {"page_id": "infrastructure-staking-mechanics", "page_title": "Staking Mechanics", "index": 0, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 487, "end_char": 537, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} -{"page_id": "infrastructure-staking-mechanics", "page_title": "Staking Mechanics", "index": 1, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 537, "end_char": 1824, "estimated_token_count": 340, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} +{"page_id": "infrastructure-staking-mechanics", "page_title": "Staking Mechanics", "index": 1, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 537, "end_char": 1791, "estimated_token_count": 331, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} {"page_id": "infrastructure", "page_title": "Infrastructure", "index": 0, "depth": 2, "title": "Choosing the Right Role", "anchor": "choosing-the-right-role", "start_char": 486, "end_char": 2813, "estimated_token_count": 439, "token_estimator": "heuristic-v1", "text": "## Choosing the Right Role\n\nSelecting your role within the Polkadot ecosystem depends on your goals, resources, and expertise. Below are detailed considerations for each role:\n\n- **Running a node**:\n - **Purpose**: A node provides access to network data and supports API queries. It is commonly used for.\n - **Development and testing**: Offers a local instance to simulate network conditions and test applications.\n - **Production use**: Acts as a data source for dApps, clients, and other applications needing reliable access to the blockchain.\n - **Requirements**: Moderate hardware resources to handle blockchain data efficiently.\n - **Responsibilities**: A node’s responsibilities vary based on its purpose.\n - **Development and testing**: Enables developers to test features, debug code, and simulate network interactions in a controlled environment.\n - **Production use**: Provides consistent and reliable data access for dApps and other applications, ensuring minimal downtime.\n\n- **Running a validator**:\n - **Purpose**: Validators play a critical role in securing the Polkadot relay chain. 
They validate parachain block submissions, participate in consensus, and help maintain the network's overall integrity.\n - **Requirements**: Becoming a validator requires.\n - **Staking**: A variable amount of DOT tokens to secure the network and demonstrate commitment.\n - **Hardware**: High-performing hardware resources capable of supporting intensive blockchain operations.\n - **Technical expertise**: Proficiency in setting up and maintaining nodes, managing updates, and understanding Polkadot's consensus mechanisms.\n - **Community involvement**: Building trust and rapport within the community to attract nominators willing to stake with your validator.\n - **Responsibilities**: Validators have critical responsibilities to ensure network health.\n - **Uptime**: Maintain near-constant availability to avoid slashing penalties for downtime or unresponsiveness.\n - **Network security**: Participate in consensus and verify parachain transactions to uphold the network's security and integrity.\n - **Availability**: Monitor the network for events and respond to issues promptly, such as misbehavior reports or protocol updates."} {"page_id": "infrastructure", "page_title": "Infrastructure", "index": 1, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 2813, "end_char": 2862, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} {"page_id": "polkadot-protocol-architecture-parachains-consensus", "page_title": "Parachain Consensus", "index": 0, "depth": 2, "title": "Introduction", "anchor": "introduction", "start_char": 23, "end_char": 936, "estimated_token_count": 146, "token_estimator": "heuristic-v1", "text": "## Introduction\n\nParachains are independent blockchains built with the Polkadot SDK, designed to leverage Polkadot’s relay chain for shared security and transaction finality. These specialized chains operate as part of Polkadot’s execution sharding model, where each parachain manages its own state and transactions while relying on the relay chain for validation and consensus.\n\nAt the core of parachain functionality are collators, specialized nodes that sequence transactions into blocks and maintain the parachain’s state. Collators optimize Polkadot’s architecture by offloading state management from the relay chain, allowing relay chain validators to focus solely on validating parachain blocks.\n\nThis guide explores how parachain consensus works, including the roles of collators and validators, and the steps involved in securing parachain blocks within Polkadot’s scalable and decentralized framework."} @@ -1239,7 +1239,7 @@ {"page_id": "tutorials-dapps-remark-tutorial", "page_title": "PAPI Account Watcher Tutorial", "index": 7, "depth": 2, "title": "Test the CLI", "anchor": "test-the-cli", "start_char": 6108, "end_char": 7721, "estimated_token_count": 521, "token_estimator": "heuristic-v1", "text": "## Test the CLI\n\nTo test the application, navigate to the [**Extrinsics** page of the PAPI Dev Console](https://dev.papi.how/extrinsics#networkId=westend&endpoint=light-client){target=\\_blank}. Select the **System** pallet and the **remark_with_event** call. Ensure the input field follows the convention `address+email`. For example, if monitoring `5GrwvaEF5zXb26Fz9rcQpDWS57CtERHpNehXCPcNoHGKutQY`, the input should be:\n\n![](/images/tutorials/dapps/remark-tutorial/papi-console.webp)\n\nSubmit the extrinsic and sign it using the Polkadot.js browser wallet. 
The CLI will display the following output and play the \"You've Got Mail!\" sound:\n\n
\n npm start -- --account 5GrwvaEF5zXb26Fz9rcQpDWS57CtERHpNehXCPcNoHGKutQY\n __ __ _ _____ __ __ _ _ __ __ _ _\n \\ \\ / /__| |__|___ / | \\/ | __ _(_) | \\ \\ / /_ _| |_ ___| |__ ___ _ __\n \\ \\ /\\ / / _ \\ '_ \\ |_ \\ | |\\/| |/ _` | | | \\ \\ /\\ / / _` | __/ __| '_ \\ / _ \\ '__|\n \\ V V / __/ |_) |__) | | | | | (_| | | | \\ V V / (_| | || (__| | | | __/ |\n \\_/\\_/ \\___|_.__/____/ |_| |_|\\__,_|_|_| \\_/\\_/ \\__,_|\\__\\___|_| |_|\\___|_|\n \n 📬 Watching account: 5Cm8yiG45rqrpyV2zPLrbtr8efksrRuCXcqcB4xj8AejfcTB\n 📥 You've got mail!\n 👤 From: 5Cm8yiG45rqrpyV2zPLrbtr8efksrRuCXcqcB4xj8AejfcTB\n 🔖 Hash: 0xb6999c9082f5b1dede08b387404c9eb4eb2deee4781415dfa7edf08b87472050\n
"} {"page_id": "tutorials-dapps-remark-tutorial", "page_title": "PAPI Account Watcher Tutorial", "index": 8, "depth": 2, "title": "Next Steps", "anchor": "next-steps", "start_char": 7721, "end_char": 8055, "estimated_token_count": 69, "token_estimator": "heuristic-v1", "text": "## Next Steps\n\nThis application demonstrates how the Polkadot API can be used to build decentralized applications. While this is not a production-grade application, it introduces several key features for developing with the Polkadot API.\n\nTo explore more, refer to the [official PAPI documentation](https://papi.how){target=\\_blank}."} {"page_id": "tutorials-dapps", "page_title": "Decentralized Application Tutorials", "index": 0, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 491, "end_char": 541, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} -{"page_id": "tutorials-dapps", "page_title": "Decentralized Application Tutorials", "index": 1, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 541, "end_char": 1220, "estimated_token_count": 190, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} +{"page_id": "tutorials-dapps", "page_title": "Decentralized Application Tutorials", "index": 1, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 541, "end_char": 1198, "estimated_token_count": 184, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} {"page_id": "tutorials-interoperability-replay-and-dry-run-xcms", "page_title": "Replay and Dry Run XCMs", "index": 0, "depth": 2, "title": "Introduction", "anchor": "introduction", "start_char": 44, "end_char": 735, "estimated_token_count": 150, "token_estimator": "heuristic-v1", "text": "## Introduction\n\nIn this tutorial, you'll learn how to replay and dry-run XCMs using [Chopsticks](/develop/toolkit/parachains/fork-chains/chopsticks/get-started/){target=\\_blank}, a powerful tool for forking live Polkadot SDK-based chains in your local environment. 
These techniques are essential for:\n\n- Debugging cross-chain message failures.\n- Tracing execution across relay chains and parachains.\n- Analyzing weight usage, error types, and message flow.\n- Safely simulating XCMs without committing state changes.\n\nBy the end of this guide, you'll be able to set up a local fork, capture and replay real XCMs, and use dry-run features to diagnose and resolve complex cross-chain issues."} {"page_id": "tutorials-interoperability-replay-and-dry-run-xcms", "page_title": "Replay and Dry Run XCMs", "index": 1, "depth": 2, "title": "Prerequisites", "anchor": "prerequisites", "start_char": 735, "end_char": 1478, "estimated_token_count": 199, "token_estimator": "heuristic-v1", "text": "## Prerequisites\n\nBefore you begin, make sure you have:\n\n- [Chopsticks](/develop/toolkit/parachains/fork-chains/chopsticks/get-started/){target=\\_blank} installed (`npm i -g @acala-network/chopsticks`).\n- Access to the endpoint or genesis file of the parachain you want to fork.\n- The block number or hash where the XCM was sent.\n- (Optional) A Chopsticks config file for repeated setups.\n\nIf you haven't forked a chain before, see the [Fork a Chain with Chopsticks guide](/tutorials/polkadot-sdk/testing/fork-live-chains/){target=\\_blank} or [Fork a Network Locally using Chopsticks](https://wiki.polkadot.com/learn/learn-guides-test-opengov-proposals/#fork-a-network-locally-using-chopsticks){target=\\_blank} for step-by-step instructions."} {"page_id": "tutorials-interoperability-replay-and-dry-run-xcms", "page_title": "Replay and Dry Run XCMs", "index": 2, "depth": 2, "title": "Set Up Your Project", "anchor": "set-up-your-project", "start_char": 1478, "end_char": 2310, "estimated_token_count": 194, "token_estimator": "heuristic-v1", "text": "## Set Up Your Project\n\nLet's start by creating a dedicated workspace for your XCM replay and dry-run experiments.\n\n1. Create a new directory and navigate into it:\n\n ```bash\n mkdir -p replay-xcm-tests\n cd replay-xcm-tests\n ```\n\n2. Initialize a new Node project:\n\n ```bash\n npm init -y\n ```\n\n3. Install Chopsticks globally (recommended to avoid conflicts with local installs):\n\n ```bash\n npm install -g @acala-network/chopsticks@latest\n ```\n\n4. Install TypeScript and related tooling for local development:\n\n ```bash\n npm install --save-dev typescript @types/node tsx\n ```\n\n5. Install the required Polkadot packages:\n\n ```bash\n npm install polkadot-api @polkadot-labs/hdkd @polkadot-labs/hdkd-helpers\n ```\n\n6. Initialize the TypeScript config:\n\n ```bash\n npx tsc --init\n ```"} @@ -1276,7 +1276,7 @@ {"page_id": "tutorials-interoperability-xcm-channels-para-to-system", "page_title": "Opening HRMP Channels with System Parachains", "index": 5, "depth": 3, "title": "Craft and Submit the XCM Message", "anchor": "craft-and-submit-the-xcm-message", "start_char": 3780, "end_char": 7208, "estimated_token_count": 685, "token_estimator": "heuristic-v1", "text": "### Craft and Submit the XCM Message\n\nConnect to parachain 2500 using Polkadot.js Apps to send the XCM message to the relay chain. Input the necessary parameters as illustrated in the image below. Make sure to:\n\n1. Insert your previously encoded `establish_channel_with_system` call data into the **`call`** field.\n2. Provide beneficiary details.\n3. Dispatch the XCM message to the relay chain by clicking the **Submit Transaction** button.\n\n![](/images/tutorials/interoperability/xcm-channels/para-to-system/hrmp-para-to-system-2.webp)\n\n!!! 
note\n The exact process and parameters for submitting this XCM message may vary depending on your specific parachain and relay chain configurations. Always refer to the most current documentation for your particular network setup.\n\nAfter successfully submitting the XCM message to the relay chain, two [`HrmpSystemChannelOpened`](https://paritytech.github.io/polkadot-sdk/master/polkadot_runtime_parachains/hrmp/pallet/enum.Event.html#variant.HrmpSystemChannelOpened){target=\\_blank} events are emitted, indicating that the channels are now present in storage under [`HrmpOpenChannelRequests`](https://paritytech.github.io/polkadot-sdk/master/polkadot_runtime_parachains/hrmp/pallet/storage_types/struct.HrmpOpenChannelRequests.html){target=\\_blank}. However, the channels are not actually set up until the start of the next session, at which point bidirectional communication between parachain 2500 and system chain 1000 is established.\n\nTo verify this, wait for the next session and then follow these steps:\n\n1. Using Polkadot.js Apps, connect to the relay chain and navigate to the **Developer** dropdown, then select **Chain state**.\n\n ![](/images/tutorials/interoperability/xcm-channels/hrmp-channels-1.webp)\n\n2. Query the HRMP channels:\n\n 1. Select **`hrmp`** from the options.\n 2. Choose the **`hrmpChannels`** call.\n 3. Click the **+** button to execute the query.\n\n ![](/images/tutorials/interoperability/xcm-channels/para-to-system/hrmp-para-to-system-3.webp)\n \n3. Examine the query results. You should see output similar to the following:\n\n ```json\n [\n [\n [\n {\n \"sender\": 1000,\n \"recipient\": 2500\n }\n ],\n {\n \"maxCapacity\": 8,\n \"maxTotalSize\": 8192,\n \"maxMessageSize\": 1048576,\n \"msgCount\": 0,\n \"totalSize\": 0,\n \"mqcHead\": null,\n \"senderDeposit\": 0,\n \"recipientDeposit\": 0\n }\n ],\n [\n [\n {\n \"sender\": 2500,\n \"recipient\": 1000\n }\n ],\n {\n \"maxCapacity\": 8,\n \"maxTotalSize\": 8192,\n \"maxMessageSize\": 1048576,\n \"msgCount\": 0,\n \"totalSize\": 0,\n \"mqcHead\": null,\n \"senderDeposit\": 0,\n \"recipientDeposit\": 0\n }\n ]\n ]\n\n ```\n\nThe output confirms the successful establishment of two HRMP channels:\n\n- From chain 1000 (system chain) to chain 2500 (parachain).\n- From chain 2500 (parachain) to chain 1000 (system chain).\n\nThis bidirectional channel enables direct communication between the system chain and the parachain, allowing for cross-chain message passing."} {"page_id": "tutorials-interoperability-xcm-channels", "page_title": "Tutorials for Managing XCM Channels", "index": 0, "depth": 2, "title": "Understand the Process of Opening Channels", "anchor": "understand-the-process-of-opening-channels", "start_char": 787, "end_char": 1357, "estimated_token_count": 95, "token_estimator": "heuristic-v1", "text": "## Understand the Process of Opening Channels\n\nEach parachain starts with two default unidirectional XCM channels: an upward channel for sending messages to the relay chain, and a downward channel for receiving messages. These channels are implicitly available.\n\nTo enable communication between parachains, explicit HRMP channels must be established by registering them on the relay chain. This process requires a deposit to cover the costs associated with storing message queues on the relay chain. 
The deposit amount depends on the specific relay chain’s parameters."} {"page_id": "tutorials-interoperability-xcm-channels", "page_title": "Tutorials for Managing XCM Channels", "index": 1, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 1357, "end_char": 1407, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} -{"page_id": "tutorials-interoperability-xcm-channels", "page_title": "Tutorials for Managing XCM Channels", "index": 2, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 1407, "end_char": 1808, "estimated_token_count": 101, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} +{"page_id": "tutorials-interoperability-xcm-channels", "page_title": "Tutorials for Managing XCM Channels", "index": 2, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 1407, "end_char": 1797, "estimated_token_count": 98, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} {"page_id": "tutorials-interoperability-xcm-fee-estimation", "page_title": "XCM Fee Estimation", "index": 0, "depth": 2, "title": "Introduction", "anchor": "introduction", "start_char": 22, "end_char": 450, "estimated_token_count": 76, "token_estimator": "heuristic-v1", "text": "## Introduction\n\nWhen sending cross-chain messages, ensure that the transaction will be successful not only in the local chain but also in the destination chain and any intermediate chains.\n\nSending cross-chain messages requires estimating the fees for the operation. \n\nThis tutorial will demonstrate how to dry-run and estimate the fees for teleporting assets from the Paseo Asset Hub parachain to the Paseo Bridge Hub chain."} {"page_id": "tutorials-interoperability-xcm-fee-estimation", "page_title": "XCM Fee Estimation", "index": 1, "depth": 2, "title": "Fee Mechanism", "anchor": "fee-mechanism", "start_char": 450, "end_char": 1437, "estimated_token_count": 222, "token_estimator": "heuristic-v1", "text": "## Fee Mechanism\n\nThere are three types of fees that can be charged when sending a cross-chain message:\n\n- **Local execution fees**: Fees charged in the local chain for executing the message.\n- **Delivery fees**: Fees charged for delivering the message to the destination chain.\n- **Remote execution fees**: Fees charged in the destination chain for executing the message.\n\nIf there are multiple intermediate chains, delivery fees and remote execution fees will be charged for each one.\n\nIn this example, you will estimate the fees for teleporting assets from the Paseo Asset Hub parachain to the Paseo Bridge Hub chain. The fee structure will be as follows:\n\n```mermaid\nflowchart LR\n AssetHub[Paseo Asset Hub] -->|Delivery Fees| BridgeHub[Paseo Bridge Hub]\n AssetHub -->|
Local
Execution
Fees| AssetHub\n BridgeHub -->|
Remote
Execution
Fees| BridgeHub\n```\n\nThe overall fees are `local_execution_fees` + `delivery_fees` + `remote_execution_fees`."} {"page_id": "tutorials-interoperability-xcm-fee-estimation", "page_title": "XCM Fee Estimation", "index": 2, "depth": 2, "title": "Environment Setup", "anchor": "environment-setup", "start_char": 1437, "end_char": 3989, "estimated_token_count": 588, "token_estimator": "heuristic-v1", "text": "## Environment Setup\n\nFirst, you need to set up your environment:\n\n1. Create a new directory and initialize the project:\n\n ```bash\n mkdir xcm-fee-estimation && \\\n cd xcm-fee-estimation\n ```\n\n2. Initialize the project:\n\n ```bash\n npm init -y\n ```\n\n3. Install dev dependencies:\n\n ```bash\n npm install --save-dev @types/node@^22.12.0 ts-node@^10.9.2 typescript@^5.7.3\n ```\n\n4. Install dependencies:\n\n ```bash\n npm install --save @polkadot-labs/hdkd@^0.0.13 @polkadot-labs/hdkd-helpers@^0.0.13 polkadot-api@1.9.5\n ```\n\n5. Create TypeScript configuration:\n\n ```bash\n npx tsc --init\n ```\n\n6. Generate the types for the Polkadot API for Paseo Bridge Hub and Paseo Asset Hub:\n\n ```bash\n npx papi add paseoAssetHub -n paseo_asset_hub && \\\n npx papi add paseoBridgeHub -w wss://bridge-hub-paseo.dotters.network\n ```\n\n7. Create a new file called `teleport-ah-to-bridge-hub.ts`:\n\n ```bash\n touch teleport-ah-to-bridge-hub.ts\n ```\n\n8. Import the necessary modules. Add the following code to the `teleport-ah-to-bridge-hub.ts` file:\n\n ```typescript title=\"teleport-ah-to-bridge-hub.ts\"\n import { paseoAssetHub, paseoBridgeHub } from '@polkadot-api/descriptors';\n import { createClient, FixedSizeBinary, Enum } from 'polkadot-api';\n import { getWsProvider } from 'polkadot-api/ws-provider/node';\n import { withPolkadotSdkCompat } from 'polkadot-api/polkadot-sdk-compat';\n import {\n XcmVersionedLocation,\n XcmVersionedAssetId,\n XcmV3Junctions,\n XcmV3MultiassetFungibility,\n XcmVersionedXcm,\n XcmV5Instruction,\n XcmV5Junctions,\n XcmV5Junction,\n XcmV5AssetFilter,\n XcmV5WildAsset,\n } from '@polkadot-api/descriptors';\n ```\n\n9. 
Define constants and a `main` function where you will implement all the logic:\n\n ```typescript title=\"teleport-ah-to-bridge-hub.ts\"\n // 1 PAS = 10^10 units\n const PAS_UNITS = 10_000_000_000n; // 1 PAS\n const PAS_CENTS = 100_000_000n; // 0.01 PAS\n\n // Paseo Asset Hub constants\n const PASEO_ASSET_HUB_RPC_ENDPOINT = 'ws://localhost:8001';\n const ASSET_HUB_ACCOUNT = '15oF4uVJwmo4TdGW7VfQxNLavjCXviqxT9S1MgbjMNHr6Sp5'; // Alice (Paseo Asset Hub)\n\n // Bridge Hub destination\n const BRIDGE_HUB_RPC_ENDPOINT = 'ws://localhost:8000';\n const BRIDGE_HUB_PARA_ID = 1002;\n const BRIDGE_HUB_BENEFICIARY =\n async function main() {\n // Code will go here\n }\n ```\n\nAll the following code explained in the subsequent sections must be added inside the `main` function."} @@ -1299,7 +1299,7 @@ {"page_id": "tutorials-interoperability", "page_title": "Interoperability Tutorials", "index": 0, "depth": 2, "title": "XCM (Cross-Consensus Messaging)", "anchor": "xcm-cross-consensus-messaging", "start_char": 645, "end_char": 894, "estimated_token_count": 43, "token_estimator": "heuristic-v1", "text": "## XCM (Cross-Consensus Messaging)\n\nXCM provides a secure and trustless framework that facilitates communication between parachains, relay chains, and external blockchains, enabling asset transfers, data sharing, and complex cross-chain workflows."} {"page_id": "tutorials-interoperability", "page_title": "Interoperability Tutorials", "index": 1, "depth": 3, "title": "For Parachain Integrators", "anchor": "for-parachain-integrators", "start_char": 894, "end_char": 1363, "estimated_token_count": 100, "token_estimator": "heuristic-v1", "text": "### For Parachain Integrators\n\nLearn to establish and use cross-chain communication channels:\n\n- **[Opening HRMP Channels Between Parachains](/tutorials/interoperability/xcm-channels/para-to-para/)**: Set up uni- and bidirectional messaging channels between parachains.\n- **[Opening HRMP Channels with System Parachains](/tutorials/interoperability/xcm-channels/para-to-system/)**: Establish communication channels with system parachains using optimized XCM messages."} {"page_id": "tutorials-interoperability", "page_title": "Interoperability Tutorials", "index": 2, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 1363, "end_char": 1413, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} -{"page_id": "tutorials-interoperability", "page_title": "Interoperability Tutorials", "index": 3, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 1413, "end_char": 2197, "estimated_token_count": 194, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} +{"page_id": "tutorials-interoperability", "page_title": "Interoperability Tutorials", "index": 3, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 1413, "end_char": 2175, "estimated_token_count": 188, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} {"page_id": "tutorials-onchain-governance-fast-track-gov-proposal", "page_title": "Fast Track a Governance Proposal", "index": 0, "depth": 2, "title": "Introduction", "anchor": "introduction", "start_char": 36, "end_char": 1714, "estimated_token_count": 314, "token_estimator": "heuristic-v1", "text": "## Introduction\n\nPolkadot's [OpenGov](/polkadot-protocol/onchain-governance/overview){target=\\_blank} is a sophisticated governance mechanism 
designed to allow the network to evolve gracefully over time, guided by its stakeholders. This system features multiple [tracks](https://wiki.polkadot.com/learn/learn-polkadot-opengov-origins/#origins-and-tracks-info){target=\\_blank} for different types of proposals, each with parameters for approval, support, and confirmation period. While this flexibility is powerful, it also introduces complexity that can lead to failed proposals or unexpected outcomes.\n\nTesting governance proposals before submission is crucial for the ecosystem. This process enhances efficiency by reducing the need for repeated submissions, improves security by identifying potential risks, and allows proposal optimization based on simulated outcomes. It also serves as an educational tool, providing stakeholders with a safe environment to understand the impacts of different voting scenarios. \n\nBy leveraging simulation tools like [Chopsticks](/develop/toolkit/parachains/fork-chains/chopsticks){target=\\_blank}, developers can:\n\n- Simulate the entire lifecycle of a proposal.\n- Test the voting outcomes by varying the support and approval levels.\n- Analyze the effects of a successfully executed proposal on the network's state.\n- Identify and troubleshoot potential issues or unexpected consequences before submitting the proposals.\n\nThis tutorial will guide you through using Chopsticks to test OpenGov proposals thoroughly. This ensures that when you submit a proposal to the live network, you can do so with confidence in its effects and viability."} {"page_id": "tutorials-onchain-governance-fast-track-gov-proposal", "page_title": "Fast Track a Governance Proposal", "index": 1, "depth": 2, "title": "Prerequisites", "anchor": "prerequisites", "start_char": 1714, "end_char": 2238, "estimated_token_count": 130, "token_estimator": "heuristic-v1", "text": "## Prerequisites\n\nBefore proceeding, ensure the following prerequisites are met:\n\n- **Chopsticks installation**: If you have not installed Chopsticks yet, refer to the [Install Chopsticks](/develop/toolkit/parachains/fork-chains/chopsticks/get-started/#install-chopsticks){target=\\_blank} guide for detailed instructions.\n- **Familiarity with key concepts**:\n - [Polkadot.js](/develop/toolkit/api-libraries/polkadot-js-api){target=\\_blank}\n - [OpenGov](/polkadot-protocol/onchain-governance/overview){target=\\_blank}"} {"page_id": "tutorials-onchain-governance-fast-track-gov-proposal", "page_title": "Fast Track a Governance Proposal", "index": 2, "depth": 2, "title": "Set Up the Project", "anchor": "set-up-the-project", "start_char": 2238, "end_char": 3770, "estimated_token_count": 327, "token_estimator": "heuristic-v1", "text": "## Set Up the Project\n\nBefore testing OpenGov proposals, you need to set up your development environment. \nYou'll set up a TypeScript project and install the required dependencies to simulate and evaluate proposals. You'll use Chopsticks to fork the Polkadot network and simulate the proposal lifecycle, while Polkadot.js will be your interface for interacting with the forked network and submitting proposals.\n\nFollow these steps to set up your project:\n\n1. Create a new project directory and navigate into it:\n ```bash\n mkdir opengov-chopsticks && cd opengov-chopsticks\n ```\n\n2. Initialize a new TypeScript project:\n ```bash\n npm init -y \\\n && npm install typescript ts-node @types/node --save-dev \\\n && npx tsc --init\n ```\n\n3. 
Install the required dependencies:\n ```bash\n npm install @polkadot/api @acala-network/chopsticks\n ```\n\n4. Create a new TypeScript file for your script:\n ```bash\n touch test-proposal.ts\n ```\n\n !!!note\n You'll write your code to simulate and test OpenGov proposals in the `test-proposal.ts` file.\n\n5. Open the `tsconfig.json` file and ensure it includes these compiler options:\n ```json\n {\n \"compilerOptions\": {\n \"module\": \"CommonJS\",\n \"esModuleInterop\": true,\n \"target\": \"esnext\",\n \"moduleResolution\": \"node\",\n \"declaration\": true,\n \"sourceMap\": true,\n \"skipLibCheck\": true,\n \"outDir\": \"dist\",\n \"composite\": true\n }\n }\n\n ```"} @@ -1313,7 +1313,7 @@ {"page_id": "tutorials-onchain-governance-fast-track-gov-proposal", "page_title": "Fast Track a Governance Proposal", "index": 10, "depth": 2, "title": "Summary", "anchor": "summary", "start_char": 92075, "end_char": 92735, "estimated_token_count": 125, "token_estimator": "heuristic-v1", "text": "## Summary\n\nIn this tutorial, you've learned how to use Chopsticks to test OpenGov proposals on a local fork of the Polkadot network. You've set up a TypeScript project, connected to a local fork, submitted a proposal, and forced its execution for testing purposes. This process allows you to:\n\n- Safely experiment with different types of proposals.\n- Test the effects of proposals without affecting the live network.\n- Rapidly iterate and debug your governance ideas.\n\nUsing these techniques, you can develop and refine your proposals before submitting them to the Polkadot network, ensuring they're well-tested and likely to achieve their intended effects."} {"page_id": "tutorials-onchain-governance-fast-track-gov-proposal", "page_title": "Fast Track a Governance Proposal", "index": 11, "depth": 2, "title": "Full Code", "anchor": "full-code", "start_char": 92735, "end_char": 169907, "estimated_token_count": 15583, "token_estimator": "heuristic-v1", "text": "## Full Code\n\nHere's the complete code for the `test-proposal.ts` file, incorporating all the steps we've covered:\n\n??? 
code \"`test-proposal.ts`\"\n ```typescript\n // --8<-- [start:imports]\n import '@polkadot/api-augment/polkadot';\n import { FrameSupportPreimagesBounded } from '@polkadot/types/lookup';\n import { blake2AsHex } from '@polkadot/util-crypto';\n import { ApiPromise, Keyring, WsProvider } from '@polkadot/api';\n import { type SubmittableExtrinsic } from '@polkadot/api/types';\n import { ISubmittableResult } from '@polkadot/types/types';\n // --8<-- [end:imports]\n\n // --8<-- [start:connectToFork]\n /**\n * Establishes a connection to the local forked chain.\n *\n * @returns A promise that resolves to an `ApiPromise` instance connected to the local chain.\n */\n async function connectToFork(): Promise {\n const wsProvider = new WsProvider('ws://localhost:8000');\n const api = await ApiPromise.create({ provider: wsProvider });\n await api.isReady;\n console.log(`Connected to chain: ${await api.rpc.system.chain()}`);\n return api;\n }\n // --8<-- [end:connectToFork]\n\n // --8<-- [start:generateProposal]\n /**\n * Generates a proposal by submitting a preimage, creating the proposal, and placing a deposit.\n *\n * @param api - An instance of the Polkadot.js API promise used to interact with the blockchain.\n * @param call - The extrinsic to be executed, encapsulating the specific action to be proposed.\n * @param origin - The origin of the proposal, specifying the source authority (e.g., `{ System: 'Root' }`).\n * @returns A promise that resolves to the proposal ID of the generated proposal.\n *\n */\n async function generateProposal(\n api: ApiPromise,\n call: SubmittableExtrinsic<'promise', ISubmittableResult>,\n origin: any\n ): Promise {\n // Initialize the keyring\n const keyring = new Keyring({ type: 'sr25519' });\n\n // Set up Alice development account\n const alice = keyring.addFromUri('//Alice');\n\n // Get the next available proposal index\n const proposalIndex = (\n await api.query.referenda.referendumCount()\n ).toNumber();\n\n // Execute the batch transaction\n await new Promise(async (resolve) => {\n const unsub = await api.tx.utility\n .batch([\n // Register the preimage for your proposal\n api.tx.preimage.notePreimage(call.method.toHex()),\n // Submit your proposal to the referenda system\n api.tx.referenda.submit(\n origin as any,\n {\n Lookup: {\n Hash: call.method.hash.toHex(),\n len: call.method.encodedLength,\n },\n },\n { At: 0 }\n ),\n // Place the required decision deposit\n api.tx.referenda.placeDecisionDeposit(proposalIndex),\n ])\n .signAndSend(alice, (status: any) => {\n if (status.blockNumber) {\n unsub();\n resolve();\n }\n });\n });\n return proposalIndex;\n }\n // --8<-- [end:generateProposal]\n\n // --8<-- [start:moveScheduledCallTo]\n /**\n * Moves a scheduled call to a specified future block if it matches the given verifier criteria.\n *\n * @param api - An instance of the Polkadot.js API promise used to interact with the blockchain.\n * @param blockCounts - The number of blocks to move the scheduled call forward.\n * @param verifier - A function to verify if a scheduled call matches the desired criteria.\n * @throws An error if no matching scheduled call is found.\n */\n async function moveScheduledCallTo(\n api: ApiPromise,\n blockCounts: number,\n verifier: (call: FrameSupportPreimagesBounded) => boolean\n ) {\n // Get the current block number\n const blockNumber = (await api.rpc.chain.getHeader()).number.toNumber();\n \n // Retrieve the scheduler's agenda entries\n const agenda = await api.query.scheduler.agenda.entries();\n \n // Initialize a flag to 
track if a matching scheduled call is found\n let found = false;\n \n // Iterate through the scheduler's agenda entries\n for (const agendaEntry of agenda) {\n // Iterate through the scheduled entries in the current agenda entry\n for (const scheduledEntry of agendaEntry[1]) {\n // Check if the scheduled entry is valid and matches the verifier criteria\n if (scheduledEntry.isSome && verifier(scheduledEntry.unwrap().call)) {\n found = true;\n \n // Overwrite the agendaEntry item in storage\n const result = await api.rpc('dev_setStorage', [\n [agendaEntry[0]], // require to ensure unique id\n [\n await api.query.scheduler.agenda.key(blockNumber + blockCounts),\n agendaEntry[1].toHex(),\n ],\n ]);\n \n // Check if the scheduled call has an associated lookup\n if (scheduledEntry.unwrap().maybeId.isSome) {\n // Get the lookup ID\n const id = scheduledEntry.unwrap().maybeId.unwrap().toHex();\n const lookup = await api.query.scheduler.lookup(id);\n\n // Check if the lookup exists\n if (lookup.isSome) {\n // Get the lookup key\n const lookupKey = await api.query.scheduler.lookup.key(id);\n \n // Create a new lookup object with the updated block number\n const fastLookup = api.registry.createType('Option<(u32,u32)>', [\n blockNumber + blockCounts,\n 0,\n ]);\n \n // Overwrite the lookup entry in storage\n const result = await api.rpc('dev_setStorage', [\n [lookupKey, fastLookup.toHex()],\n ]);\n }\n }\n }\n }\n }\n \n // Throw an error if no matching scheduled call is found\n if (!found) {\n throw new Error('No scheduled call found');\n }\n }\n // --8<-- [end:moveScheduledCallTo]\n\n // --8<-- [start:forceProposalExecution]\n /**\n * Forces the execution of a specific proposal by updating its referendum state and ensuring the execution process is triggered.\n *\n * @param api - An instance of the Polkadot.js API promise used to interact with the blockchain.\n * @param proposalIndex - The index of the proposal to be executed.\n * @throws An error if the referendum is not found or not in an ongoing state.\n */\n async function forceProposalExecution(api: ApiPromise, proposalIndex: number) {\n // Retrieve the referendum data for the given proposal index\n const referendumData = await api.query.referenda.referendumInfoFor(\n proposalIndex\n );\n // Get the storage key for the referendum data\n const referendumKey =\n api.query.referenda.referendumInfoFor.key(proposalIndex);\n\n // Check if the referendum data exists\n if (!referendumData.isSome) {\n throw new Error(`Referendum ${proposalIndex} not found`);\n }\n\n const referendumInfo = referendumData.unwrap();\n\n // Check if the referendum is in an ongoing state\n if (!referendumInfo.isOngoing) {\n throw new Error(`Referendum ${proposalIndex} is not ongoing`);\n }\n\n // Get the ongoing referendum data\n const ongoingData = referendumInfo.asOngoing;\n // Convert the ongoing data to JSON\n const ongoingJson = ongoingData.toJSON();\n\n // Support Lookup, Inline or Legacy proposals\n const callHash = ongoingData.proposal.isLookup\n ? ongoingData.proposal.asLookup.toHex()\n : ongoingData.proposal.isInline\n ? 
blake2AsHex(ongoingData.proposal.asInline.toHex())\n : ongoingData.proposal.asLegacy.toHex();\n\n // Get the total issuance of the native token\n const totalIssuance = (await api.query.balances.totalIssuance()).toBigInt();\n\n // Get the current block number\n const proposalBlockTarget = (\n await api.rpc.chain.getHeader()\n ).number.toNumber();\n\n // Create a new proposal data object with the updated fields\n const fastProposalData = {\n ongoing: {\n ...ongoingJson,\n enactment: { after: 0 },\n deciding: {\n since: proposalBlockTarget - 1,\n confirming: proposalBlockTarget - 1,\n },\n tally: {\n ayes: totalIssuance - 1n,\n nays: 0,\n support: totalIssuance - 1n,\n },\n alarm: [proposalBlockTarget + 1, [proposalBlockTarget + 1, 0]],\n },\n };\n\n // Create a new proposal object from the proposal data\n let fastProposal;\n try {\n fastProposal = api.registry.createType(\n `Option`,\n fastProposalData\n );\n } catch {\n fastProposal = api.registry.createType(\n `Option`,\n fastProposalData\n );\n }\n\n // Update the storage with the new proposal object\n const result = await api.rpc('dev_setStorage', [\n [referendumKey, fastProposal.toHex()],\n ]);\n\n // Fast forward the nudge referendum to the next block to get the refendum to be scheduled\n await moveScheduledCallTo(api, 1, (call) => {\n if (!call.isInline) {\n return false;\n }\n\n const callData = api.createType('Call', call.asInline.toHex());\n\n return (\n callData.method == 'nudgeReferendum' &&\n (callData.args[0] as any).toNumber() == proposalIndex\n );\n });\n\n // Create a new block\n await api.rpc('dev_newBlock', { count: 1 });\n\n // Move the scheduled call to the next block\n await moveScheduledCallTo(api, 1, (call) =>\n call.isLookup\n ? call.asLookup.toHex() == callHash\n : call.isInline\n ? 
blake2AsHex(call.asInline.toHex()) == callHash\n : call.asLegacy.toHex() == callHash\n );\n\n // Create another new block\n await api.rpc('dev_newBlock', { count: 1 });\n }\n // --8<-- [end:forceProposalExecution]\n\n // --8<-- [start:main]\n const main = async () => {\n // Connect to the forked chain\n const api = await connectToFork();\n\n // Select the call to perform\n const call = api.tx.system.setCodeWithoutChecks('0x1234');\n\n // Select the origin\n const origin = {\n System: 'Root',\n };\n\n // Submit preimage, submit proposal, and place decision deposit\n const proposalIndex = await generateProposal(api, call, origin);\n\n // Force the proposal to be executed\n await forceProposalExecution(api, proposalIndex);\n\n process.exit(0);\n };\n // --8<-- [end:main]\n\n // --8<-- [start:try-catch-block]\n try {\n main();\n } catch (e) {\n console.log(e);\n process.exit(1);\n }\n // --8<-- [end:try-catch-block]\n\n // --8<-- [start:imports]\n import '@polkadot/api-augment/polkadot';\n import { FrameSupportPreimagesBounded } from '@polkadot/types/lookup';\n import { blake2AsHex } from '@polkadot/util-crypto';\n import { ApiPromise, Keyring, WsProvider } from '@polkadot/api';\n import { type SubmittableExtrinsic } from '@polkadot/api/types';\n import { ISubmittableResult } from '@polkadot/types/types';\n // --8<-- [end:imports]\n\n // --8<-- [start:connectToFork]\n /**\n * Establishes a connection to the local forked chain.\n *\n * @returns A promise that resolves to an `ApiPromise` instance connected to the local chain.\n */\n async function connectToFork(): Promise {\n const wsProvider = new WsProvider('ws://localhost:8000');\n const api = await ApiPromise.create({ provider: wsProvider });\n await api.isReady;\n console.log(`Connected to chain: ${await api.rpc.system.chain()}`);\n return api;\n }\n // --8<-- [end:connectToFork]\n\n // --8<-- [start:generateProposal]\n /**\n * Generates a proposal by submitting a preimage, creating the proposal, and placing a deposit.\n *\n * @param api - An instance of the Polkadot.js API promise used to interact with the blockchain.\n * @param call - The extrinsic to be executed, encapsulating the specific action to be proposed.\n * @param origin - The origin of the proposal, specifying the source authority (e.g., `{ System: 'Root' }`).\n * @returns A promise that resolves to the proposal ID of the generated proposal.\n *\n */\n async function generateProposal(\n api: ApiPromise,\n call: SubmittableExtrinsic<'promise', ISubmittableResult>,\n origin: any\n ): Promise {\n // Initialize the keyring\n const keyring = new Keyring({ type: 'sr25519' });\n\n // Set up Alice development account\n const alice = keyring.addFromUri('//Alice');\n\n // Get the next available proposal index\n const proposalIndex = (\n await api.query.referenda.referendumCount()\n ).toNumber();\n\n // Execute the batch transaction\n await new Promise(async (resolve) => {\n const unsub = await api.tx.utility\n .batch([\n // Register the preimage for your proposal\n api.tx.preimage.notePreimage(call.method.toHex()),\n // Submit your proposal to the referenda system\n api.tx.referenda.submit(\n origin as any,\n {\n Lookup: {\n Hash: call.method.hash.toHex(),\n len: call.method.encodedLength,\n },\n },\n { At: 0 }\n ),\n // Place the required decision deposit\n api.tx.referenda.placeDecisionDeposit(proposalIndex),\n ])\n .signAndSend(alice, (status: any) => {\n if (status.blockNumber) {\n unsub();\n resolve();\n }\n });\n });\n return proposalIndex;\n }\n // --8<-- 
[end:generateProposal]\n\n // --8<-- [start:moveScheduledCallTo]\n /**\n * Moves a scheduled call to a specified future block if it matches the given verifier criteria.\n *\n * @param api - An instance of the Polkadot.js API promise used to interact with the blockchain.\n * @param blockCounts - The number of blocks to move the scheduled call forward.\n * @param verifier - A function to verify if a scheduled call matches the desired criteria.\n * @throws An error if no matching scheduled call is found.\n */\n async function moveScheduledCallTo(\n api: ApiPromise,\n blockCounts: number,\n verifier: (call: FrameSupportPreimagesBounded) => boolean\n ) {\n // Get the current block number\n const blockNumber = (await api.rpc.chain.getHeader()).number.toNumber();\n \n // Retrieve the scheduler's agenda entries\n const agenda = await api.query.scheduler.agenda.entries();\n \n // Initialize a flag to track if a matching scheduled call is found\n let found = false;\n \n // Iterate through the scheduler's agenda entries\n for (const agendaEntry of agenda) {\n // Iterate through the scheduled entries in the current agenda entry\n for (const scheduledEntry of agendaEntry[1]) {\n // Check if the scheduled entry is valid and matches the verifier criteria\n if (scheduledEntry.isSome && verifier(scheduledEntry.unwrap().call)) {\n found = true;\n \n // Overwrite the agendaEntry item in storage\n const result = await api.rpc('dev_setStorage', [\n [agendaEntry[0]], // require to ensure unique id\n [\n await api.query.scheduler.agenda.key(blockNumber + blockCounts),\n agendaEntry[1].toHex(),\n ],\n ]);\n \n // Check if the scheduled call has an associated lookup\n if (scheduledEntry.unwrap().maybeId.isSome) {\n // Get the lookup ID\n const id = scheduledEntry.unwrap().maybeId.unwrap().toHex();\n const lookup = await api.query.scheduler.lookup(id);\n\n // Check if the lookup exists\n if (lookup.isSome) {\n // Get the lookup key\n const lookupKey = await api.query.scheduler.lookup.key(id);\n \n // Create a new lookup object with the updated block number\n const fastLookup = api.registry.createType('Option<(u32,u32)>', [\n blockNumber + blockCounts,\n 0,\n ]);\n \n // Overwrite the lookup entry in storage\n const result = await api.rpc('dev_setStorage', [\n [lookupKey, fastLookup.toHex()],\n ]);\n }\n }\n }\n }\n }\n \n // Throw an error if no matching scheduled call is found\n if (!found) {\n throw new Error('No scheduled call found');\n }\n }\n // --8<-- [end:moveScheduledCallTo]\n\n // --8<-- [start:forceProposalExecution]\n /**\n * Forces the execution of a specific proposal by updating its referendum state and ensuring the execution process is triggered.\n *\n * @param api - An instance of the Polkadot.js API promise used to interact with the blockchain.\n * @param proposalIndex - The index of the proposal to be executed.\n * @throws An error if the referendum is not found or not in an ongoing state.\n */\n async function forceProposalExecution(api: ApiPromise, proposalIndex: number) {\n // Retrieve the referendum data for the given proposal index\n const referendumData = await api.query.referenda.referendumInfoFor(\n proposalIndex\n );\n // Get the storage key for the referendum data\n const referendumKey =\n api.query.referenda.referendumInfoFor.key(proposalIndex);\n\n // Check if the referendum data exists\n if (!referendumData.isSome) {\n throw new Error(`Referendum ${proposalIndex} not found`);\n }\n\n const referendumInfo = referendumData.unwrap();\n\n // Check if the referendum is in an ongoing 
state\n if (!referendumInfo.isOngoing) {\n throw new Error(`Referendum ${proposalIndex} is not ongoing`);\n }\n\n // Get the ongoing referendum data\n const ongoingData = referendumInfo.asOngoing;\n // Convert the ongoing data to JSON\n const ongoingJson = ongoingData.toJSON();\n\n // Support Lookup, Inline or Legacy proposals\n const callHash = ongoingData.proposal.isLookup\n ? ongoingData.proposal.asLookup.toHex()\n : ongoingData.proposal.isInline\n ? blake2AsHex(ongoingData.proposal.asInline.toHex())\n : ongoingData.proposal.asLegacy.toHex();\n\n // Get the total issuance of the native token\n const totalIssuance = (await api.query.balances.totalIssuance()).toBigInt();\n\n // Get the current block number\n const proposalBlockTarget = (\n await api.rpc.chain.getHeader()\n ).number.toNumber();\n\n // Create a new proposal data object with the updated fields\n const fastProposalData = {\n ongoing: {\n ...ongoingJson,\n enactment: { after: 0 },\n deciding: {\n since: proposalBlockTarget - 1,\n confirming: proposalBlockTarget - 1,\n },\n tally: {\n ayes: totalIssuance - 1n,\n nays: 0,\n support: totalIssuance - 1n,\n },\n alarm: [proposalBlockTarget + 1, [proposalBlockTarget + 1, 0]],\n },\n };\n\n // Create a new proposal object from the proposal data\n let fastProposal;\n try {\n fastProposal = api.registry.createType(\n `Option`,\n fastProposalData\n );\n } catch {\n fastProposal = api.registry.createType(\n `Option`,\n fastProposalData\n );\n }\n\n // Update the storage with the new proposal object\n const result = await api.rpc('dev_setStorage', [\n [referendumKey, fastProposal.toHex()],\n ]);\n\n // Fast forward the nudge referendum to the next block to get the refendum to be scheduled\n await moveScheduledCallTo(api, 1, (call) => {\n if (!call.isInline) {\n return false;\n }\n\n const callData = api.createType('Call', call.asInline.toHex());\n\n return (\n callData.method == 'nudgeReferendum' &&\n (callData.args[0] as any).toNumber() == proposalIndex\n );\n });\n\n // Create a new block\n await api.rpc('dev_newBlock', { count: 1 });\n\n // Move the scheduled call to the next block\n await moveScheduledCallTo(api, 1, (call) =>\n call.isLookup\n ? call.asLookup.toHex() == callHash\n : call.isInline\n ? 
blake2AsHex(call.asInline.toHex()) == callHash\n : call.asLegacy.toHex() == callHash\n );\n\n // Create another new block\n await api.rpc('dev_newBlock', { count: 1 });\n }\n // --8<-- [end:forceProposalExecution]\n\n // --8<-- [start:main]\n const main = async () => {\n // Connect to the forked chain\n const api = await connectToFork();\n\n // Select the call to perform\n const call = api.tx.system.setCodeWithoutChecks('0x1234');\n\n // Select the origin\n const origin = {\n System: 'Root',\n };\n\n // Submit preimage, submit proposal, and place decision deposit\n const proposalIndex = await generateProposal(api, call, origin);\n\n // Force the proposal to be executed\n await forceProposalExecution(api, proposalIndex);\n\n process.exit(0);\n };\n // --8<-- [end:main]\n\n // --8<-- [start:try-catch-block]\n try {\n main();\n } catch (e) {\n console.log(e);\n process.exit(1);\n }\n // --8<-- [end:try-catch-block]\n\n ```"} {"page_id": "tutorials-onchain-governance", "page_title": "On-Chain Governance Tutorials", "index": 0, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 405, "end_char": 455, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} -{"page_id": "tutorials-onchain-governance", "page_title": "On-Chain Governance Tutorials", "index": 1, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 455, "end_char": 883, "estimated_token_count": 117, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} +{"page_id": "tutorials-onchain-governance", "page_title": "On-Chain Governance Tutorials", "index": 1, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 455, "end_char": 872, "estimated_token_count": 114, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} {"page_id": "tutorials-polkadot-sdk-parachains-zero-to-hero-add-pallets-to-runtime", "page_title": "Add Pallets to the Runtime", "index": 0, "depth": 2, "title": "Introduction", "anchor": "introduction", "start_char": 30, "end_char": 866, "estimated_token_count": 192, "token_estimator": "heuristic-v1", "text": "## Introduction\n\nIn previous tutorials, you learned how to [create a custom pallet](/tutorials/polkadot-sdk/parachains/zero-to-hero/build-custom-pallet/){target=\\_blank} and [test it](/tutorials/polkadot-sdk/parachains/zero-to-hero/pallet-unit-testing/){target=\\_blank}. The next step is to include this pallet in your runtime, integrating it into the core logic of your blockchain.\n\nThis tutorial will guide you through adding two pallets to your runtime: the custom pallet you previously developed and the [utility pallet](https://paritytech.github.io/polkadot-sdk/master/pallet_utility/index.html){target=\\_blank}. This standard Polkadot SDK pallet provides powerful dispatch functionality. 
The utility pallet offers, for example, batch dispatch, a stateless operation that enables executing multiple calls in a single transaction."} {"page_id": "tutorials-polkadot-sdk-parachains-zero-to-hero-add-pallets-to-runtime", "page_title": "Add Pallets to the Runtime", "index": 1, "depth": 2, "title": "Add the Pallets as Dependencies", "anchor": "add-the-pallets-as-dependencies", "start_char": 866, "end_char": 8510, "estimated_token_count": 1856, "token_estimator": "heuristic-v1", "text": "## Add the Pallets as Dependencies\n\nFirst, you'll update the runtime's `Cargo.toml` file to include the Utility pallet and your custom pallets as dependencies for the runtime. Follow these steps:\n\n1. Open the `runtime/Cargo.toml` file and locate the `[dependencies]` section. Add pallet-utility as one of the features for the `polkadot-sdk` dependency with the following line:\n\n ```toml hl_lines=\"4\" title=\"runtime/Cargo.toml\"\n [dependencies]\n ...\n polkadot-sdk = { workspace = true, features = [\n \"pallet-utility\",\n ...\n ], default-features = false }\n ```\n\n2. In the same `[dependencies]` section, add the custom pallet that you built from scratch with the following line:\n\n ```toml hl_lines=\"3\" title=\"Cargo.toml\"\n [dependencies]\n ...\n custom-pallet = { path = \"../pallets/custom-pallet\", default-features = false }\n ```\n\n3. In the `[features]` section, add the custom pallet to the `std` feature list:\n\n ```toml hl_lines=\"5\" title=\"Cargo.toml\"\n [features]\n default = [\"std\"]\n std = [\n ...\n \"custom-pallet/std\",\n ...\n ]\n ```\n\n3. Save the changes and close the `Cargo.toml` file.\n\n Once you have saved your file, it should look like the following:\n\n ???- code \"runtime/Cargo.toml\"\n \n ```rust title=\"runtime/Cargo.toml\"\n [package]\n name = \"parachain-template-runtime\"\n description = \"A parachain runtime template built with Substrate and Cumulus, part of Polkadot Sdk.\"\n version = \"0.1.0\"\n license = \"Unlicense\"\n authors.workspace = true\n homepage.workspace = true\n repository.workspace = true\n edition.workspace = true\n publish = false\n\n [package.metadata.docs.rs]\n targets = [\"x86_64-unknown-linux-gnu\"]\n\n [build-dependencies]\n docify = { workspace = true }\n substrate-wasm-builder = { optional = true, workspace = true, default-features = true }\n\n [dependencies]\n codec = { features = [\"derive\"], workspace = true }\n cumulus-pallet-parachain-system.workspace = true\n docify = { workspace = true }\n hex-literal = { optional = true, workspace = true, default-features = true }\n log = { workspace = true }\n pallet-parachain-template = { path = \"../pallets/template\", default-features = false }\n polkadot-sdk = { workspace = true, features = [\n \"pallet-utility\",\n \"cumulus-pallet-aura-ext\",\n \"cumulus-pallet-session-benchmarking\",\n \"cumulus-pallet-weight-reclaim\",\n \"cumulus-pallet-xcm\",\n \"cumulus-pallet-xcmp-queue\",\n \"cumulus-primitives-aura\",\n \"cumulus-primitives-core\",\n \"cumulus-primitives-utility\",\n \"pallet-aura\",\n \"pallet-authorship\",\n \"pallet-balances\",\n \"pallet-collator-selection\",\n \"pallet-message-queue\",\n \"pallet-session\",\n \"pallet-sudo\",\n \"pallet-timestamp\",\n \"pallet-transaction-payment\",\n \"pallet-transaction-payment-rpc-runtime-api\",\n \"pallet-xcm\",\n \"parachains-common\",\n \"polkadot-parachain-primitives\",\n \"polkadot-runtime-common\",\n \"runtime\",\n \"staging-parachain-info\",\n \"staging-xcm\",\n \"staging-xcm-builder\",\n \"staging-xcm-executor\",\n ], 
default-features = false }\n scale-info = { features = [\"derive\"], workspace = true }\n serde_json = { workspace = true, default-features = false, features = [\n \"alloc\",\n ] }\n smallvec = { workspace = true, default-features = true }\n\n custom-pallet = { path = \"../pallets/custom-pallet\", default-features = false }\n\n [features]\n default = [\"std\"]\n std = [\n \"codec/std\",\n \"cumulus-pallet-parachain-system/std\",\n \"log/std\",\n \"pallet-parachain-template/std\",\n \"polkadot-sdk/std\",\n \"scale-info/std\",\n \"serde_json/std\",\n \"substrate-wasm-builder\",\n \"custom-pallet/std\",\n ]\n\n runtime-benchmarks = [\n \"cumulus-pallet-parachain-system/runtime-benchmarks\",\n \"hex-literal\",\n \"pallet-parachain-template/runtime-benchmarks\",\n \"polkadot-sdk/runtime-benchmarks\",\n ]\n\n try-runtime = [\n \"cumulus-pallet-parachain-system/try-runtime\",\n \"pallet-parachain-template/try-runtime\",\n \"polkadot-sdk/try-runtime\",\n ]\n\n # Enable the metadata hash generation.\n #\n # This is hidden behind a feature because it increases the compile time.\n # The wasm binary needs to be compiled twice, once to fetch the metadata,\n # generate the metadata hash and then a second time with the\n # `RUNTIME_METADATA_HASH` environment variable set for the `CheckMetadataHash`\n # extension.\n metadata-hash = [\"substrate-wasm-builder/metadata-hash\"]\n\n # A convenience feature for enabling things when doing a build\n # for an on-chain release.\n on-chain-release-build = [\"metadata-hash\"]\n\n ```\n\nUpdate your root parachain template's `Cargo.toml` file to include your custom pallet as a dependency. Follow these steps:\n\n1. Open the `./Cargo.toml` file and locate the `[workspace]` section. \n \n Make sure the `custom-pallet` is a member of the workspace:\n\n ```toml hl_lines=\"4\" title=\"Cargo.toml\"\n [workspace]\n default-members = [\"pallets/template\", \"runtime\"]\n members = [\n \"node\", \"pallets/custom-pallet\",\n \"pallets/template\",\n \"runtime\",\n ]\n ```\n\n???- code \"./Cargo.toml\"\n\n ```rust title=\"./Cargo.toml\"\n [workspace.package]\n license = \"MIT-0\"\n authors = [\"Parity Technologies \"]\n homepage = \"https://paritytech.github.io/polkadot-sdk/\"\n repository = \"https://github.com/paritytech/polkadot-sdk-parachain-template.git\"\n edition = \"2021\"\n\n [workspace]\n default-members = [\"pallets/template\", \"runtime\"]\n members = [\n \"node\", \"pallets/custom-pallet\",\n \"pallets/template\",\n \"runtime\",\n ]\n resolver = \"2\"\n\n [workspace.dependencies]\n parachain-template-runtime = { path = \"./runtime\", default-features = false }\n pallet-parachain-template = { path = \"./pallets/template\", default-features = false }\n clap = { version = \"4.5.13\" }\n color-print = { version = \"0.3.4\" }\n docify = { version = \"0.2.9\" }\n futures = { version = \"0.3.31\" }\n jsonrpsee = { version = \"0.24.3\" }\n log = { version = \"0.4.22\", default-features = false }\n polkadot-sdk = { version = \"2503.0.1\", default-features = false }\n prometheus-endpoint = { version = \"0.17.2\", default-features = false, package = \"substrate-prometheus-endpoint\" }\n serde = { version = \"1.0.214\", default-features = false }\n codec = { version = \"3.7.4\", default-features = false, package = \"parity-scale-codec\" }\n cumulus-pallet-parachain-system = { version = \"0.20.0\", default-features = false }\n hex-literal = { version = \"0.4.1\", default-features = false }\n scale-info = { version = \"2.11.6\", default-features = false }\n serde_json = { version = 
\"1.0.132\", default-features = false }\n smallvec = { version = \"1.11.0\", default-features = false }\n substrate-wasm-builder = { version = \"26.0.1\", default-features = false }\n frame = { version = \"0.9.1\", default-features = false, package = \"polkadot-sdk-frame\" }\n\n [profile.release]\n opt-level = 3\n panic = \"unwind\"\n\n [profile.production]\n codegen-units = 1\n inherits = \"release\"\n lto = true\n ```"} {"page_id": "tutorials-polkadot-sdk-parachains-zero-to-hero-add-pallets-to-runtime", "page_title": "Add Pallets to the Runtime", "index": 2, "depth": 3, "title": "Update the Runtime Configuration", "anchor": "update-the-runtime-configuration", "start_char": 8510, "end_char": 10415, "estimated_token_count": 406, "token_estimator": "heuristic-v1", "text": "### Update the Runtime Configuration\n\nConfigure the pallets by implementing their `Config` trait and update the runtime macro to include the new pallets:\n\n1. Add the `OriginCaller` import:\n\n ```rust title=\"mod.rs\" hl_lines=\"8\"\n // Local module imports\n use super::OriginCaller;\n ...\n ```\n\n2. Implement the [`Config`](https://paritytech.github.io/polkadot-sdk/master/pallet_utility/pallet/trait.Config.html){target=\\_blank} trait for both pallets at the end of the `runtime/src/config/mod.rs` file:\n\n ```rust title=\"mod.rs\" hl_lines=\"8-25\"\n ...\n /// Configure the pallet template in pallets/template.\n impl pallet_parachain_template::Config for Runtime {\n type RuntimeEvent = RuntimeEvent;\n type WeightInfo = pallet_parachain_template::weights::SubstrateWeight;\n }\n\n // Configure utility pallet.\n impl pallet_utility::Config for Runtime {\n type RuntimeEvent = RuntimeEvent;\n type RuntimeCall = RuntimeCall;\n type PalletsOrigin = OriginCaller;\n type WeightInfo = pallet_utility::weights::SubstrateWeight;\n }\n // Define counter max value runtime constant.\n parameter_types! {\n pub const CounterMaxValue: u32 = 500;\n }\n\n // Configure custom pallet.\n impl custom_pallet::Config for Runtime {\n type RuntimeEvent = RuntimeEvent;\n type CounterMaxValue = CounterMaxValue;\n }\n ```\n\n3. Locate the `#[frame_support::runtime]` macro in the `runtime/src/lib.rs` file and add the pallets:\n\n ```rust hl_lines=\"9-14\" title=\"lib.rs\"\n #[frame_support::runtime]\n mod runtime {\n #[runtime::runtime]\n #[runtime::derive(\n ...\n )]\n pub struct Runtime;\n #[runtime::pallet_index(51)]\n pub type Utility = pallet_utility;\n\n #[runtime::pallet_index(52)]\n pub type CustomPallet = custom_pallet;\n }\n ```"} @@ -1433,7 +1433,7 @@ {"page_id": "tutorials-polkadot-sdk-system-chains-asset-hub", "page_title": "Asset Hub Tutorials", "index": 0, "depth": 2, "title": "Benefits of Asset Hub", "anchor": "benefits-of-asset-hub", "start_char": 23, "end_char": 1017, "estimated_token_count": 224, "token_estimator": "heuristic-v1", "text": "## Benefits of Asset Hub\n\nPolkadot SDK-based relay chains focus on security and consensus, leaving asset management to an external component, such as a [system chain](/polkadot-protocol/architecture/system-chains/){target=\\_blank}. The [Asset Hub](/polkadot-protocol/architecture/system-chains/asset-hub/){target=\\_blank} is one example of a system chain and is vital to managing tokens which aren't native to the Polkadot ecosystem. 
Developers opting to integrate with Asset Hub can expect the following benefits:\n\n- **Support for non-native on-chain assets**: Create and manage your own tokens or NFTs with Polkadot ecosystem compatibility available out of the box.\n- **Lower transaction fees**: Approximately 1/10th of the cost of using the relay chain.\n- **Reduced deposit requirements**: Approximately 1/100th of the deposit required for the relay chain.\n- **Payment of fees with non-native assets**: No need to buy native tokens for gas, increasing flexibility for developers and users."} {"page_id": "tutorials-polkadot-sdk-system-chains-asset-hub", "page_title": "Asset Hub Tutorials", "index": 1, "depth": 2, "title": "Get Started", "anchor": "get-started", "start_char": 1017, "end_char": 1303, "estimated_token_count": 48, "token_estimator": "heuristic-v1", "text": "## Get Started\n\nThrough these tutorials, you'll learn how to manage cross-chain assets, including:\n\n- Asset registration and configuration\n- Cross-chain asset representation\n- Liquidity pool creation and management \n- Asset swapping and conversion\n- Transaction parameter optimization"} {"page_id": "tutorials-polkadot-sdk-system-chains-asset-hub", "page_title": "Asset Hub Tutorials", "index": 2, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 1303, "end_char": 1353, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} -{"page_id": "tutorials-polkadot-sdk-system-chains-asset-hub", "page_title": "Asset Hub Tutorials", "index": 3, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 1353, "end_char": 1778, "estimated_token_count": 116, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} +{"page_id": "tutorials-polkadot-sdk-system-chains-asset-hub", "page_title": "Asset Hub Tutorials", "index": 3, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 1353, "end_char": 1767, "estimated_token_count": 113, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} {"page_id": "tutorials-polkadot-sdk-system-chains", "page_title": "System Chains Tutorials", "index": 0, "depth": 2, "title": "For Parachain Integrators", "anchor": "for-parachain-integrators", "start_char": 619, "end_char": 990, "estimated_token_count": 83, "token_estimator": "heuristic-v1", "text": "## For Parachain Integrators\n\nEnhance cross-chain interoperability and expand your parachain’s functionality:\n\n- **[Register your parachain's asset on Asset Hub](/tutorials/polkadot-sdk/system-chains/asset-hub/register-foreign-asset/)**: Connect your parachain’s assets to Asset Hub as a foreign asset using XCM, enabling seamless cross-chain transfers and integration."} {"page_id": "tutorials-polkadot-sdk-system-chains", "page_title": "System Chains Tutorials", "index": 1, "depth": 2, "title": "For Developers Leveraging System Chains", "anchor": "for-developers-leveraging-system-chains", "start_char": 990, "end_char": 1551, "estimated_token_count": 134, "token_estimator": "heuristic-v1", "text": "## For Developers Leveraging System Chains\n\nUnlock new possibilities by tapping into Polkadot’s system chains:\n\n- **[Register a new asset on Asset Hub](/tutorials/polkadot-sdk/system-chains/asset-hub/register-local-asset/)**: Create and customize assets directly on Asset Hub (local assets) with parameters like metadata, minimum balances, and more.\n\n- **[Convert 
Assets](/tutorials/polkadot-sdk/system-chains/asset-hub/asset-conversion/)**: Use Asset Hub's AMM functionality to swap between different assets, provide liquidity to pools, and manage LP tokens."} {"page_id": "tutorials-polkadot-sdk-system-chains", "page_title": "System Chains Tutorials", "index": 2, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 1551, "end_char": 1600, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} @@ -1459,7 +1459,7 @@ {"page_id": "tutorials-polkadot-sdk-testing", "page_title": "Blockchain Testing Tutorials", "index": 1, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 794, "end_char": 843, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} {"page_id": "tutorials-polkadot-sdk", "page_title": "Polkadot SDK Tutorials", "index": 0, "depth": 2, "title": "Build and Deploy a Parachain", "anchor": "build-and-deploy-a-parachain", "start_char": 450, "end_char": 1038, "estimated_token_count": 133, "token_estimator": "heuristic-v1", "text": "## Build and Deploy a Parachain\n\nFollow these key milestones to guide you through parachain development. Each step links to detailed tutorials for a deeper dive into each stage:\n\n- **[Install the Polkadot SDK](/develop/parachains/install-polkadot-sdk/)**: Set up the necessary tools to begin building on Polkadot. This step will get your environment ready for parachain development.\n\n- **[Parachains Zero to Hero](/tutorials/polkadot-sdk/parachains/zero-to-hero/)**: A series of step-by-step guides to building, testing, and deploying custom pallets and runtimes using the Polkadot SDK."} {"page_id": "tutorials-polkadot-sdk", "page_title": "Polkadot SDK Tutorials", "index": 1, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 1038, "end_char": 1088, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} -{"page_id": "tutorials-polkadot-sdk", "page_title": "Polkadot SDK Tutorials", "index": 2, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 1088, "end_char": 1489, "estimated_token_count": 113, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} +{"page_id": "tutorials-polkadot-sdk", "page_title": "Polkadot SDK Tutorials", "index": 2, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 1088, "end_char": 1478, "estimated_token_count": 110, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} {"page_id": "tutorials-smart-contracts-demo-aplications-deploying-uniswap-v2", "page_title": "Deploying Uniswap V2 on Polkadot", "index": 0, "depth": 2, "title": "Introduction", "anchor": "introduction", "start_char": 191, "end_char": 857, "estimated_token_count": 131, "token_estimator": "heuristic-v1", "text": "## Introduction\n\nDecentralized exchanges (DEXs) are a cornerstone of the DeFi ecosystem, allowing for permissionless token swaps without intermediaries. [Uniswap V2](https://docs.uniswap.org/contracts/v2/overview){target=\\_blank}, with its Automated Market Maker (AMM) model, revolutionized DEXs by enabling liquidity provision for any ERC-20 token pair.\n\nThis tutorial will guide you through how Uniswap V2 works so you can take advantage of it in your projects deployed to Polkadot Hub. 
By understanding these contracts, you'll gain hands-on experience with one of the most influential DeFi protocols and understand how it functions across blockchain ecosystems."} {"page_id": "tutorials-smart-contracts-demo-aplications-deploying-uniswap-v2", "page_title": "Deploying Uniswap V2 on Polkadot", "index": 1, "depth": 2, "title": "Prerequisites", "anchor": "prerequisites", "start_char": 857, "end_char": 1352, "estimated_token_count": 121, "token_estimator": "heuristic-v1", "text": "## Prerequisites\n\nBefore starting, make sure you have:\n\n- Node.js (v16.0.0 or later) and npm installed.\n- Basic understanding of Solidity and JavaScript.\n- Familiarity with [`hardhat-polkadot`](/develop/smart-contracts/dev-environments/hardhat){target=\\_blank} development environment.\n- Some PAS test tokens to cover transaction fees (obtained from the [Polkadot faucet](https://faucet.polkadot.io/?parachain=1111){target=\\_blank}).\n- Basic understanding of how AMMs and liquidity pools work."} {"page_id": "tutorials-smart-contracts-demo-aplications-deploying-uniswap-v2", "page_title": "Deploying Uniswap V2 on Polkadot", "index": 2, "depth": 2, "title": "Set Up the Project", "anchor": "set-up-the-project", "start_char": 1352, "end_char": 3690, "estimated_token_count": 572, "token_estimator": "heuristic-v1", "text": "## Set Up the Project\n\nLet's start by cloning the Uniswap V2 project:\n\n1. Clone the Uniswap V2 repository:\n\n ```\n git clone https://github.com/polkadot-developers/polkavm-hardhat-examples.git -b v0.0.6\n cd polkavm-hardhat-examples/uniswap-v2-polkadot/\n ```\n\n2. Install the required dependencies:\n\n ```bash\n npm install\n ```\n\n3. Update the `hardhat.config.js` file so the paths for the Substrate node and the ETH-RPC adapter match with the paths on your machine. For more info, check the [Testing your Contract](/develop/smart-contracts/dev-environments/hardhat/#testing-your-contract){target=\\_blank} section in the Hardhat guide.\n\n ```js title=\"hardhat.config.js\"\n hardhat: {\n polkavm: true,\n nodeConfig: {\n nodeBinaryPath: '../bin/substrate-node',\n rpcPort: 8000,\n dev: true,\n },\n adapterConfig: {\n adapterBinaryPath: '../bin/eth-rpc',\n dev: true,\n },\n },\n ```\n\n4. Create a `.env` file in your project root to store your private keys (you can use as an example the `env.example` file):\n\n ```text title=\".env\"\n LOCAL_PRIV_KEY=\"INSERT_LOCAL_PRIVATE_KEY\"\n AH_PRIV_KEY=\"INSERT_AH_PRIVATE_KEY\"\n ```\n\n Ensure to replace `\"INSERT_LOCAL_PRIVATE_KEY\"` with a private key available in the local environment (you can get them from this [file](https://github.com/paritytech/hardhat-polkadot/blob/main/packages/hardhat-polkadot-node/src/constants.ts#L22){target=\\_blank}). And `\"INSERT_AH_PRIVATE_KEY\"` with the account's private key you want to use to deploy the contracts. You can get this by exporting the private key from your wallet (e.g., MetaMask).\n\n !!!warning\n Keep your private key safe, and never share it with anyone. If it is compromised, your funds can be stolen.\n\n5. Compile the contracts:\n\n ```bash\n npx hardhat compile\n ```\n\nIf the compilation is successful, you should see the following output:\n\n
\n npx hardhat compile\n Compiling 12 Solidity files\n Successfully compiled 12 Solidity files\n
\n\nAfter running the above command, you should see the compiled contracts in the `artifacts-pvm` directory. This directory contains the ABI and bytecode of your contracts."} @@ -1522,6 +1522,6 @@ {"page_id": "tutorials-smart-contracts", "page_title": "Smart Contracts", "index": 1, "depth": 2, "title": "Start Building", "anchor": "start-building", "start_char": 739, "end_char": 1008, "estimated_token_count": 51, "token_estimator": "heuristic-v1", "text": "## Start Building\n\nJump into the tutorials and learn how to:\n\n- Write and compile smart contracts.\n- Deploy contracts to the Polkadot network.\n- Interact with deployed contracts using libraries like Ethers.js and viem.\n\nChoose a tutorial below and start coding today!"} {"page_id": "tutorials-smart-contracts", "page_title": "Smart Contracts", "index": 2, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 1008, "end_char": 1057, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} {"page_id": "tutorials", "page_title": "Tutorials", "index": 0, "depth": 2, "title": "Polkadot Zero to Hero", "anchor": "polkadot-zero-to-hero", "start_char": 326, "end_char": 452, "estimated_token_count": 25, "token_estimator": "heuristic-v1", "text": "## Polkadot Zero to Hero\n\nThe Zero to Hero series offers step-by-step guidance to development across the Polkadot ecosystem."} -{"page_id": "tutorials", "page_title": "Tutorials", "index": 1, "depth": 3, "title": "Parachain Developers", "anchor": "parachain-developers", "start_char": 452, "end_char": 948, "estimated_token_count": 135, "token_estimator": "heuristic-v1", "text": "### Parachain Developers\n\n"} -{"page_id": "tutorials", "page_title": "Tutorials", "index": 2, "depth": 2, "title": "Featured Tutorials", "anchor": "featured-tutorials", "start_char": 948, "end_char": 2449, "estimated_token_count": 422, "token_estimator": "heuristic-v1", "text": "## Featured Tutorials\n\n"} -{"page_id": "tutorials", "page_title": "Tutorials", "index": 3, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 2449, "end_char": 2498, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} +{"page_id": "tutorials", "page_title": "Tutorials", "index": 1, "depth": 3, "title": "Parachain Developers", "anchor": "parachain-developers", "start_char": 452, "end_char": 937, "estimated_token_count": 132, "token_estimator": "heuristic-v1", "text": "### Parachain Developers\n\n"} +{"page_id": "tutorials", "page_title": "Tutorials", "index": 2, "depth": 2, "title": "Featured Tutorials", "anchor": "featured-tutorials", "start_char": 937, "end_char": 2394, "estimated_token_count": 410, "token_estimator": "heuristic-v1", "text": "## Featured Tutorials\n\n"} +{"page_id": "tutorials", "page_title": "Tutorials", "index": 3, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 2394, "end_char": 2443, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} diff --git a/tutorials/dapps/index.md b/tutorials/dapps/index.md index a67b8a74e..0e425bdc5 100644 --- a/tutorials/dapps/index.md +++ b/tutorials/dapps/index.md @@ -20,14 +20,12 @@ You'll explore a range of topics—from client-side apps and CLI tools to on-cha diff --git a/tutorials/index.md b/tutorials/index.md index 86de48570..96e58d0ea 100644 --- a/tutorials/index.md +++ 
b/tutorials/index.md @@ -20,7 +20,6 @@ The Zero to Hero series offers step-by-step guidance to development across the P @@ -32,28 +31,24 @@ The Zero to Hero series offers step-by-step guidance to development across the P diff --git a/tutorials/interoperability/index.md b/tutorials/interoperability/index.md index 83c3e4550..4d27d61e1 100644 --- a/tutorials/interoperability/index.md +++ b/tutorials/interoperability/index.md @@ -31,14 +31,12 @@ Learn to establish and use cross-chain communication channels: diff --git a/tutorials/interoperability/xcm-channels/index.md b/tutorials/interoperability/xcm-channels/index.md index a927df0b6..47253b6e9 100644 --- a/tutorials/interoperability/xcm-channels/index.md +++ b/tutorials/interoperability/xcm-channels/index.md @@ -26,7 +26,6 @@ To enable communication between parachains, explicit HRMP channels must be estab diff --git a/tutorials/onchain-governance/index.md b/tutorials/onchain-governance/index.md index fb270890d..2d68e137f 100644 --- a/tutorials/onchain-governance/index.md +++ b/tutorials/onchain-governance/index.md @@ -20,7 +20,6 @@ This section provides step-by-step tutorials to help you navigate the technical diff --git a/tutorials/polkadot-sdk/index.md b/tutorials/polkadot-sdk/index.md index 633105c3c..d9f4ee471 100644 --- a/tutorials/polkadot-sdk/index.md +++ b/tutorials/polkadot-sdk/index.md @@ -28,7 +28,6 @@ Follow these key milestones to guide you through parachain development. Each ste diff --git a/tutorials/polkadot-sdk/system-chains/asset-hub/index.md b/tutorials/polkadot-sdk/system-chains/asset-hub/index.md index 692ed8e8a..fc6a91236 100644 --- a/tutorials/polkadot-sdk/system-chains/asset-hub/index.md +++ b/tutorials/polkadot-sdk/system-chains/asset-hub/index.md @@ -35,7 +35,6 @@ Through these tutorials, you'll learn how to manage cross-chain assets, includin