This guide covers the fundamentals of working with Gradle in this project. Understanding these concepts will help you navigate the build system and contribute effectively.
Core Concepts:
- Gradle Files — Groovy vs Kotlin DSL
- Build Lifecycle — Initialization → Configuration → Execution phases
- Tasks — Lifecycle tasks, inputs/outputs, lazy configuration
- Configurations — Understanding their roles and relationships; `api` vs `implementation`
- Dependencies — Use single GAV strings, exclusions, resolution strategies, locking
- Convention Plugins — Project and settings plugins in our `buildSrc/` build logic
- Lazy API — Use `named()`, `register()`, `configureEach()` for performance
dd-trace-java Specifics:
- Custom Extensions — `testJvmConstraints`, `tracerJava`, CI slots, git change tracking
- Script Plugins — Standalone `.gradle` files (deprecated; ongoing effort to migrate them to convention plugins)
- Our Daemon JVM — Configuring the JVM that runs Gradle
Troubleshooting:
- Build Scans & Diagnostics — Develocity, `--info`, `--scan`, critical path analysis
- Configuration Cache — Common violations and fixes
Tip
First time with Gradle? Start with Gradle Files, Build Lifecycle, and Tasks.
Gradle builds are defined through a set of build scripts. These scripts can be written in two Domain Specific Languages (DSLs): Groovy DSL and Kotlin DSL.
The original Gradle DSL uses Groovy syntax. Files use the .gradle extension.
Avoid the Groovy DSL when you can. However, since the project still relies heavily on script plugins written in Groovy, using the Kotlin DSL is not always straightforward.
plugins {
id 'java'
}
dependencies {
implementation 'com.google.guava:guava:32.1.2-jre'
}
tasks.register('hello') {
doLast {
println 'Hello from Groovy DSL'
}
}
Note
Ideally, prefer the Kotlin DSL as it has better IDE support. However, due to the existing script plugins, this is not always easy.
The Kotlin DSL offers type-safety, better IDE support, and compile-time checking. Files use the .gradle.kts extension.
plugins {
id("java")
}
dependencies {
implementation("com.google.guava:guava:32.1.2-jre")
}
tasks.register("hello") {
doLast {
println("Hello from Kotlin DSL")
}
}
Key differences at a glance:
| Aspect | Groovy DSL | Kotlin DSL |
|---|---|---|
| File extension | `.gradle` | `.gradle.kts` |
| String quotes | Single `'` or double `"` | Double `"` only |
| Method calls | Parentheses optional | Parentheses required |
| Property assignment | `=` optional | `=` required (mostly) |
| IDE support | Limited | Full auto-completion and refactoring |
| Type safety | Dynamic typing | Static typing with compile-time checks |
Gradle executes builds in distinct phases. Understanding this lifecycle is essential for writing correct and efficient build logic.
Gradle determines which projects are part of the build. It executes:
- `init.gradle` (or scripts in `~/.gradle/init.d/`): Global initialization scripts that run before any project is evaluated
- `settings.gradle.kts`: Defines the main repository project structure and discovers subprojects
// settings.gradle.kts
rootProject.name = "my-project"
include("module-a")
include("module-b")
include("module-c:submodule")
Gradle evaluates all build scripts of the participating projects. During this phase:
- Build scripts (`build.gradle.kts`) are executed
- Tasks are registered and configured
- The task graph is constructed based on dependencies
Note
Code in the configuration phase runs on every build invocation, even if the requested task doesn't need it. Keep configuration-time logic fast and avoid I/O operations.
// build.gradle.kts
plugins {
id("java")
}
// This runs during CONFIGURATION - avoid expensive operations in this phase
val expensiveValue = file("some-file.txt").readText() // Bad!
tasks.register("myTask") {
// Task configuration also runs during configuration phase
// But the task ACTION (doLast/doFirst) runs during execution
doLast {
// This runs during EXECUTION phase
println("Executing myTask")
}
}
Gradle executes the selected tasks in dependency order. Only tasks required to complete the requested goal are executed.
./gradlew build
> Task :compileJava
> Task :processResources
> Task :classes
> Task :jar
> Task :assemble
> Task :compileTestJava
> Task :testClasses
> Task :test
> Task :check
> Task :build
In a well-organized Gradle project, build logic lives in specific places:
| Location | Purpose |
|---|---|
| `settings.gradle.kts` | Project structure, repository settings, plugin management |
| `build.gradle.kts` | Project-specific build configuration |
| `buildSrc/` | Build logic automatically included by Gradle; contains convention plugins and shared configuration. Other locations are possible but require explicit declaration. |
| `gradle/` | Version catalogs, wrapper files, and script plugins |
Caution
Script plugins are not recommended. The best practice is to develop our build logic as convention plugins or binary plugins.
During the Configuration phase, Gradle doesn't simply execute build scripts top-to-bottom. Instead, it first extracts and processes certain special blocks before compiling the rest of the script. This is necessary because Gradle needs to know which plugins to apply before it can understand the DSL extensions they provide.
Processing order for settings.gradle.kts (Initialization phase):
1. `pluginManagement {}` — Configures plugin repositories and version resolution. If present, it must be the first block.
2. `plugins {}` — Declares the settings plugins to apply.
3. Script body — Project includes, build configuration, etc.
Processing order for build.gradle.kts (Configuration phase):
1. `buildscript {}` — Declares dependencies for the build script itself (the script's classpath). It should now be avoided in favor of the `plugins {}` block. If present, it must be the first block.
2. `plugins {}` — Declares plugins to apply. Gradle extracts this block first to load plugin classes before compiling the rest.
3. Script body — The rest of the script is compiled and executed, now with access to DSL extensions from applied plugins.
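To make the ordering concrete, here is a minimal `settings.gradle.kts` sketch; the repositories and the settings plugin id are illustrative placeholders, not this project's actual configuration:

```kotlin
// settings.gradle.kts — block order matters
pluginManagement {
    // 1. If present, must come first: where plugins are resolved from
    repositories {
        gradlePluginPortal()
        mavenCentral()
    }
}

plugins {
    // 2. Settings plugins are applied next (hypothetical id for illustration)
    id("com.example.my-settings-plugin") version "1.0"
}

// 3. Script body: project structure and other build configuration
rootProject.name = "my-project"
include("module-a")
```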
A task usually represents an independent unit of work; however, there are also lifecycle tasks. A lifecycle task is a task that doesn't perform work itself but aggregates other tasks. It provides convenient entry points for common build operations.
The base plugin applies the lifecycle-base plugin (org.gradle.language.base.plugins.LifecycleBasePlugin), which defines these standard lifecycle tasks:
| Task | Purpose |
|---|---|
| `clean` | Deletes the build directory |
| `assemble` | Assembles all outputs (e.g., JARs) without running tests |
| `check` | Runs all verification tasks (tests, linting, etc.) |
| `verification` | Base task for all verification tasks (`check` depends on it) |
| `build` | Performs a full build (`assemble` + `check`) |
Tip
Use ./gradlew tasks to list available tasks, or ./gradlew tasks --all to include tasks from all subprojects.
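Defining your own lifecycle task follows the same pattern: register a task with no actions and wire other tasks to it with `dependsOn`. A minimal sketch (the task names here are illustrative, not tasks defined by this project):

```kotlin
// A custom lifecycle task aggregating code-quality tasks under one entry point
tasks.register("qualityCheck") {
    group = "verification"
    description = "Runs all code-quality checks"
    // No doLast/doFirst action: this task only aggregates others
    dependsOn("checkstyleMain", "spotbugsMain")
}
```

Running `./gradlew qualityCheck` then executes the aggregated tasks in dependency order.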
Gradle tasks declare inputs (what they read) and outputs (what they produce). This metadata enables two key optimizations:
- Incremental builds: If inputs haven't changed, the task is `UP-TO-DATE` and skipped
- Build caching: Outputs can be stored and retrieved (`FROM-CACHE`) across builds
In custom tasks, use annotations to declare inputs and outputs:
| Annotation | Purpose |
|---|---|
| `@Input` | A simple value (String, Boolean, etc.) |
| `@InputFile` | A single input file |
| `@InputFiles` | Multiple input files |
| `@InputDirectory` | An input directory |
| `@OutputFile` | A single output file |
| `@OutputDirectory` | An output directory |
| `@Internal` | Excluded from up-to-date checks |
| `@Nested` | A nested object with its own input/output annotations |
More annotations are documented in the official Gradle documentation.
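As a sketch of how these annotations combine in a custom task class (the task and its properties are illustrative, assuming a recent Gradle version):

```kotlin
@CacheableTask
abstract class StampFileTask : DefaultTask() {
    @get:Input
    abstract val stamp: Property<String> // simple value, part of up-to-date checks

    @get:InputFile
    @get:PathSensitive(PathSensitivity.RELATIVE)
    abstract val source: RegularFileProperty // single input file

    @get:OutputFile
    abstract val target: RegularFileProperty // single output file

    @TaskAction
    fun run() {
        // Re-executed only when the stamp or the source file content changes
        target.get().asFile.writeText(stamp.get() + "\n" + source.get().asFile.readText())
    }
}
```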
For ad-hoc tasks or when you can't use annotations, declare inputs and outputs programmatically via the inputs and outputs properties:
tasks.register("processData") {
// Declare inputs
inputs.property("version", project.version)
inputs.file("config.json")
inputs.files(fileTree("src/data"))
inputs.dir("templates")
// Declare outputs
outputs.file(project.layout.buildDirectory.file("output.txt"))
outputs.dir(project.layout.buildDirectory.dir("generated"))
// Cacheability (required for build cache)
outputs.cacheIf { true }
doLast {
// Task action
}
}

| Method | Purpose |
|---|---|
| `inputs.property(name, value)` | A named input value |
| `inputs.file(path)` | A single input file |
| `inputs.files(paths)` | Multiple input files |
| `inputs.dir(path)` | An input directory |
| `destroyables.register(paths)` | Paths that will be deleted |
| `outputs.file(path)` | A single output file |
| `outputs.dir(path)` | An output directory |
| `outputs.cacheIf { }` | Enable build caching conditionally |
Note
Prefer annotations in custom task classes for better type safety and documentation.
Use the programmatic API for ad-hoc tasks registered with tasks.register.
Gradle uses lazy configuration to defer value resolution until needed. Instead of setting values directly, you use Property and Provider types:
abstract class MyTask : DefaultTask() {
@get:Input
abstract val message: Property<String> // Lazy, mutable
@get:InputFile
abstract val inputFile: RegularFileProperty
@get:OutputDirectory
abstract val outputDir: DirectoryProperty
}
Common property types:
- `Property<T>` — single value
- `ListProperty<T>` — ordered collection
- `SetProperty<T>` — unique values
- `MapProperty<K, V>` — key-value pairs
- `RegularFileProperty` / `DirectoryProperty` — file system locations
Note
Lazy properties avoid configuration-time overhead and ensure values are resolved in the correct order during the build.
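The payoff of these lazy types is wiring: connecting a producer task's output `Provider` to a consumer task's input both defers resolution and lets Gradle infer the task dependency automatically. A minimal sketch (illustrative task types; on older Gradle versions use `.set(...)` instead of `=` assignment):

```kotlin
abstract class ProducerTask : DefaultTask() {
    @get:OutputFile
    abstract val outputFile: RegularFileProperty

    @TaskAction
    fun produce() = outputFile.get().asFile.writeText("data")
}

abstract class ConsumerTask : DefaultTask() {
    @get:InputFile
    abstract val inputFile: RegularFileProperty

    @TaskAction
    fun consume() = println(inputFile.get().asFile.readText())
}

val producer = tasks.register<ProducerTask>("producer") {
    outputFile = layout.buildDirectory.file("data.txt")
}

tasks.register<ConsumerTask>("consumer") {
    // Wiring the provider implicitly adds a dependency on :producer
    inputFile = producer.flatMap { it.outputFile }
}
```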
When running Gradle tasks, you'll see status labels indicating what happened during execution, for example with --console=verbose:
| Label | Description |
|---|---|
| (no label) | The task ran and executed its actions |
| `UP-TO-DATE` | The task's outputs are current; no work needed |
| `FROM-CACHE` | Outputs were retrieved from the build cache |
| `SKIPPED` | The task was excluded (e.g., via `-x` or an `onlyIf` condition) |
| `NO-SOURCE` | The task had no input files to process |
- Prefer `Sync` over `Copy`: `Sync` mirrors the source to the destination by removing files that no longer exist in the source. `Copy` leaves stale files behind, which can cause subtle bugs with old configs, renamed classes, or deleted resources.
- Prefer a `Delete` task over `delete()`: The `Delete` task type properly declares what it destroys, allowing Gradle to order tasks correctly and preventing the deletion of files that other tasks need.

// ❌ Ad-hoc deletion
tasks.register("cleanGenerated") {
    doLast {
        delete(layout.buildDirectory.dir("generated")) // Runs immediately when task executes
    }
}

// ✅ Proper Delete task
tasks.register<Delete>("cleanGenerated") {
    delete(layout.buildDirectory.dir("generated"))
}
Configurations are a fundamental concept in Gradle's dependency management system. Understanding them is essential for working effectively with the build.
A configuration is a named collection of dependencies that serves a specific purpose in the build. Think of configurations as labeled buckets where you place dependencies based on how they should be used.
dependencies {
// "implementation" is a configuration
implementation("com.google.guava:guava:32.1.2-jre")
// "testImplementation" is another configuration
testImplementation("org.junit.jupiter:junit-jupiter:5.10.0")
}
Configurations serve two main purposes:
- Declaring dependencies: You add dependencies to configurations to express what your project needs
- Resolving dependencies: Gradle uses configurations to compute the full dependency graph (including transitives)
Tip
The full flow:
- You declare dependencies in declarable configurations
- Gradle resolves them into resolvable classpaths
- Two usual consumers:
  - The source set (via tasks) uses resolvable classpaths to compile/run
  - Other projects consume via the consumable `apiElements` / `runtimeElements`
Configurations have three key attributes that define their role:
| Attribute | Description |
|---|---|
| Declarable (`canBeDeclared`) | Can have dependencies added to it directly in the `dependencies {}` block |
| Resolvable (`canBeResolved`) | Can be resolved to produce a set of files (e.g., for compilation or runtime) |
| Consumable (`canBeConsumed`) | Can be consumed by other projects as a dependency |
Important
These three roles are mutually exclusive. Each configuration should have exactly one role enabled. A single configuration should not attempt to declare, resolve, and expose simultaneously.
Most configurations you interact with are declarable only — they're buckets where you put dependencies. Gradle then creates internal configurations that are resolvable (to get the actual files) or consumable (to expose artifacts to dependent projects).
Note
You rarely interact with resolvable or consumable configurations directly. Gradle creates and manages them automatically when you apply plugins like java or java-library.
When you apply the java-library plugin, Gradle creates a comprehensive set of configurations. The following diagram shows the relationships between them (based on the official Gradle documentation):
---
config:
flowchart:
curve: bundle
htmlLabels: true
---
graph LR
subgraph Declarable["bucket configurations (<b>declarable</b>)"]
api[api]
compileOnly[compileOnly]
implementation[implementation]
runtimeOnly[runtimeOnly]
end
subgraph Resolvable["classpaths (<b>resolvable</b>)"]
compileClasspath[compileClasspath]
runtimeClasspath[runtimeClasspath]
end
subgraph SourceSet["sourceSet"]
main([main])
end
subgraph Consumable["elements (<b>consumable</b>)"]
apiElements[apiElements]
runtimeElements[runtimeElements]
end
%% API flow - exposed to consumers
api --> apiElements
api --> implementation
%% Compile-time flow
compileOnly --> compileClasspath
%% Implementation flow - internal only
implementation --> compileClasspath
implementation --> runtimeClasspath
implementation --> runtimeElements
%% Runtime-only flow
runtimeOnly --> runtimeClasspath
runtimeOnly --> runtimeElements
%% Source set uses resolvable configurations
compileClasspath --> main
runtimeClasspath --> main
%% Node styling
style api fill:#90caf9,stroke:#1976d2
style compileOnly fill:#90caf9,stroke:#1976d2
style implementation fill:#90caf9,stroke:#1976d2
style runtimeOnly fill:#90caf9,stroke:#1976d2
style compileClasspath fill:#a5d6a7,stroke:#388e3c
style runtimeClasspath fill:#a5d6a7,stroke:#388e3c
style main fill:#e0e0e0,stroke:#757575
style apiElements fill:#fff9c4,stroke:#f9a825
style runtimeElements fill:#fff9c4,stroke:#f9a825
%% Edge styling - long dashes for consumable (indices: 0=api->apiElements, 5=impl->runtimeElements, 7=runtimeOnly->runtimeElements)
linkStyle 0 stroke:#9e9e9e,stroke-dasharray:10 5
linkStyle 5 stroke:#9e9e9e,stroke-dasharray:10 5
linkStyle 7 stroke:#9e9e9e,stroke-dasharray:10 5
Legend:
| Block Color | Role | Description |
|---|---|---|
| 🔵 Blue | Declarable | Where you add dependencies (api, implementation, compileOnly, runtimeOnly) |
| 🟢 Green | Resolvable | Used by tasks to get files (compileClasspath, runtimeClasspath) |
| 🟡 Yellow | Consumable | Exposed to consumer projects (apiElements, runtimeElements) |
| ⬜ Gray | Source set | The source set whose compile/run tasks consume the resolvable classpaths |
| Configuration | Compile Classpath | Runtime Classpath | Exposed to Consumers | Use Case |
|---|---|---|---|---|
| `api` | ✅ | ✅ | ✅ | Types in your public API (method signatures, return types) |
| `implementation` | ✅ | ✅ | ❌ | Internal dependencies not exposed to consumers |
| `compileOnly` | ✅ | ❌ | ❌ | Provided at runtime by the environment (e.g., servlet-api) |
| `compileOnlyApi` | ✅ | ❌ | ✅ | Compile-only dependency that's part of the public API |
| `runtimeOnly` | ❌ | ✅ | ❌ | Needed only at runtime (e.g., JDBC drivers, logging backends) |
Note
compileOnlyApi flows to apiElements (so consumers see it at compile time), i.e. the existing apiElements
handles both api and compileOnlyApi exposure.
The key difference between api and implementation is transitive exposure to consumers:
graph LR
subgraph "Library Project"
api_dep[api: Guava]
impl_dep[implementation: OkHttp]
end
subgraph "Consuming Project"
consumer[depends on Library]
compile[compileClasspath]
runtime[runtimeClasspath]
end
api_dep -->|"exposed"| compile
api_dep -->|"exposed"| runtime
impl_dep -.->|"NOT on compile classpath"| compile
impl_dep -->|"on runtime classpath"| runtime
style api_dep fill:#c8e6c9
style impl_dep fill:#ffecb3
style compile fill:#e3f2fd
style runtime fill:#e3f2fd
| Configuration | Transitive to Consumers' Compile | Transitive to Consumers' Runtime | When to Use |
|---|---|---|---|
| `api` | ✅ | ✅ | Types appear in the project's public API |
| `implementation` | ❌ | ✅ | The project's internal implementation detail, not part of the public API |
Example:
// java-library plugin required for 'api'
plugins {
`java-library`
}
dependencies {
// Guava types appear in public method signatures → use api
api("com.google.guava:guava:32.1.2-jre")
// OkHttp is used internally, not exposed → use implementation
implementation("com.squareup.okhttp3:okhttp:4.12.0")
}
public class MyService {
// Guava's ImmutableList is in the public API → needs 'api'
public ImmutableList<String> getItems() {
return ImmutableList.of("a", "b");
}
// OkHttpClient is private, only used internally → 'implementation' is fine
private final OkHttpClient client = new OkHttpClient();
}
Tip
Prefer implementation over api when dependencies are internal. Using implementation keeps dependencies
hidden from other projects, which:
- Reduces consumers' compile classpath (faster compilation)
- Allows changing dependencies without breaking consumers
- Avoids version conflicts in dependent projects
Configurations can inherit dependencies from other configurations using extendsFrom. When configuration A extends configuration B, A automatically includes all dependencies from B.
Built-in inheritance: The Java plugin automatically sets up inheritance so test code can use production dependencies:
graph LR
subgraph Main
implementation
end
subgraph Test
testImplementation
end
subgraph Test Fixtures
testFixturesImplementation
end
implementation --> testImplementation
implementation --> testFixturesImplementation
testFixturesImplementation -.->|"testFixtures(project)"| testImplementation
Custom configurations: Use extendsFrom when creating your own configurations that should inherit from existing ones:
// Create a declarable configuration for integration tests
configurations.dependencyScope("integrationTestImplementation") {
    extendsFrom(configurations.named("testImplementation").get())
}
dependencies {
    // Gets all testImplementation deps automatically via inheritance
    "integrationTestImplementation"("org.testcontainers:testcontainers:1.19.0")
}
Each test suite (like `test`, `integrationTest`) gets its own set of configurations that mirror the main ones.
In this project, the gradle/test-suites.gradle script provides helpers to create test suites with proper configuration inheritance:
| Helper | Description |
|---|---|
| `addTestSuite('name')` | Creates the `name` test suite extending `test`, sources in `src/name/` |
| `addTestSuiteForDir('name', 'dir')` | Creates the `name` test suite extending `test`, sources in `src/dir/` |
| `addTestSuiteExtendingForDir('name', 'parent', 'dir')` | Creates the `name` test suite extending the `parent` test suite, sources in `src/dir/` |
For example:
// Creates 'latestDepTest' suite extending 'test', sources in src/latestDepTest/
addTestSuite('latestDepTest')
// Creates 'latestDepForkedTest' suite extending 'latestDepTest', sources in src/latestDepTest/
addTestSuiteExtendingForDir('latestDepForkedTest', 'latestDepTest', 'latestDepTest')

graph LR
subgraph Main
implementation
end
subgraph test
testImplementation
end
subgraph latestDepTest
latestDepTestImplementation
end
subgraph latestDepForkedTest
latestDepForkedTestImplementation
end
implementation --> testImplementation --> latestDepTestImplementation --> latestDepForkedTestImplementation
Similar inheritance applies to compileOnly, runtimeOnly, and annotationProcessor configurations.
You can create configurations for special purposes. Use the factory methods to explicitly document intent and set the appropriate flags automatically:
// Dependencies are declared on a dependency-scope configuration ...
val codeGenerator = configurations.dependencyScope("codeGenerator")
// ... and resolved through a resolvable configuration that extends it
// (a resolvable configuration cannot have dependencies declared on it directly)
configurations.resolvable("codeGeneratorClasspath") {
    extendsFrom(codeGenerator.get())
}
dependencies {
    "codeGenerator"("com.example:my-generator:1.0")
}
// Use the resolved files in a task
tasks.register("generateCode") {
    val generatorClasspath = configurations.named("codeGeneratorClasspath")
    inputs.files(generatorClasspath)
    doLast {
        // generatorClasspath.get().files contains the resolved JARs
    }
}
Factory methods for custom configurations:
| Factory Method | Role | Use Case |
|---|---|---|
| `resolvable(...)` | Resolvable | Resolve to get files (e.g., tool classpaths, code generators) |
| `consumable(...)` | Consumable | Expose artifacts to other projects |
| `dependencyScope(...)` | Declarable | Bucket for declaring dependencies |
Tip
While it's not always possible, prefer factory methods over manually setting
isCanBeResolved/isCanBeConsumed flags. They make the configuration's purpose
explicit and prevent accidental misconfiguration.
To see all configurations and their relationships:
# List all configurations
./gradlew :my-project:dependencies
# Show a specific configuration's dependency tree
./gradlew :my-project:dependencies --configuration runtimeClasspath
# Show all resolvable configurations
./gradlew :my-project:resolvableConfigurations
Now that you understand configurations, let's look at how to declare and manage dependencies effectively.
Dependencies are declared in the dependencies {} block, specifying both the configuration and the dependency coordinates.
Always prefer the single GAV (Group:Artifact:Version) string notation over the map-based syntax:
dependencies {
// ❌ Avoid - map-based notation (verbose, error-prone)
implementation(group = "net.bytebuddy", name = "byte-buddy", version = "1.18.3")
// ✅ Preferred - single GAV string
implementation("net.bytebuddy:byte-buddy:1.18.3")
testImplementation("org.junit.jupiter:junit-jupiter:5.14.1")
}
It's easier to read and write, and it's the standard way to communicate coordinates (it's also what Maven Central recommends). Finally, IDEs and linters work better with string literals.
Tip
The single GAV string is the official Gradle best practice.
For consistency across modules, use version catalogs defined in gradle/libs.versions.toml:
Important
Instrumentation modules are an exception: they declare library-specific versions directly in their build files rather than using the version catalog. This is because instrumentation modules need to test against the exact versions they instrument.
# gradle/libs.versions.toml
[versions]
byte-buddy = "1.18.3"
slf4j = "1.7.30"
junit5 = "5.14.1"
[libraries]
bytebuddy = { module = "net.bytebuddy:byte-buddy", version.ref = "byte-buddy" }
slf4j = { module = "org.slf4j:slf4j-api", version.ref = "slf4j" }
junit-jupiter = { module = "org.junit.jupiter:junit-jupiter", version.ref = "junit5" }

dependencies {
implementation(libs.bytebuddy)
implementation(libs.slf4j)
testImplementation(libs.junit.jupiter)
}
Sometimes you need to exclude transitive dependencies to avoid version conflicts or unwanted libraries.
Exclude a transitive dependency from a single declaration using exclude:
dependencies {
// Exclude specific transitive dependency
implementation("com.example:library:1.0") {
exclude(group = "org.slf4j", module = "slf4j-api")
}
// Exclude all modules from a group
implementation("com.example:another-library:2.0") {
exclude(group = "commons-logging")
}
}
Exclude a transitive dependency from all dependencies in a configuration:
configurations.named("implementation") {
exclude(group = "org.slf4j", module = "slf4j-log4j12")
}
// Exclude from test configurations
configurations.named("testImplementation") {
exclude(group = "junit", module = "junit") // Exclude JUnit 4 when using JUnit 5
}
Exclude a dependency from the entire project across all configurations (use with caution):
configurations.configureEach {
exclude(group = "commons-logging", module = "commons-logging")
}Caution
Global exclusions affect all configurations, including those you might not expect (build scripts, plugins, tooling). This can break unexpected things:
- Build plugins that depend on the excluded library
- Tooling configurations (e.g., code generation, static analysis)
- Transitive resolution in unrelated modules
Prefer configuration-specific or dependency-specific exclusions unless you have a clear understanding of the impact.
To verify exclusions worked as expected:
# See the resolved dependency tree
./gradlew :dd-java-agent:testing:dependencies --configuration api
# Look for lines showing exclusions applied:
# +--- org.yaml:snakeyaml:2.0
# | \--- org.snakeyaml:snakeyaml-engine:2.6 -> excluded (via configuration api)

Example: Replace a dependency globally
configurations.configureEach {
resolutionStrategy.eachDependency {
// Replace log4j with reload4j (security fix)
if (requested.group == "log4j" && requested.name == "log4j") {
useTarget("ch.qos.reload4j:reload4j:${requested.version}")
because("log4j has critical vulnerabilities")
}
}
}
For multi-module projects where you want to suggest versions without directly adding dependencies:
dependencies {
// Define constraints - these don't add dependencies themselves
constraints {
implementation("org.slf4j:slf4j-api:2.0.0")
}
// When another module pulls in slf4j-api, it will use 2.0.0
implementation("com.example:library-that-uses-slf4j:1.0")
}
Constraints are useful for:
- Aligning transitive dependency versions across modules
- Platform/BOM definitions
- Enforcing security patches without modifying every module
Tip
Follow the official Gradle dependency management best practices for robust builds:
- Use single GAV strings for dependency declarations
- Prefer version catalogs for multi-module projects to centralize versions
- Avoid dynamic versions (`1.+`, `latest.release`) in production builds; they're non-reproducible (they are fine in instrumentation tests, though)
- Use dependency constraints rather than forcing versions when possible
- Exclude dependencies at the narrowest scope (specific dependency > configuration > global)
- Document why you're forcing versions or excluding dependencies (use `because()`)
- Verify dependency trees with `./gradlew dependencies` after making changes
For general Gradle best practices beyond dependencies, see the official best practices guide.
Dependency locking ensures reproducible builds by pinning exact versions of all transitive dependencies. This project uses Gradle's built-in dependency locking to:
- Enable reproducible builds: Rebuild any version with identical dependencies
- Prevent unexpected updates: Floating versions (`[16.0,20.0]`, `1.+`) are resolved once and locked
- Improve IDE performance: IDEs don't re-index on every library release
- Track dependency changes: Lock file diffs show exactly what changed
All projects have dependency locking enabled via the dd-trace-java.dependency-locking convention plugin. Lock files are stored as gradle.lockfile in each project directory.
Lock files are automatically updated weekly by CI. To update them manually:
# Update all lock files in the repository
./gradlew resolveAndLockAll --write-locks
# Update lock files for a specific project
./gradlew :dd-trace-api:dependencies --write-locks
Important
This project uses lenient lock mode (LockMode.LENIENT), which allows the build to succeed
even if locked dependencies can't be resolved. This prevents build failures when dependencies
are temporarily unavailable or when resolution conflicts occur.
Lock files apply whenever dependencies are resolved:
- During compilation (`compileJava`, `compileTestJava`)
- During test execution (`test`, `latestDepTest`)
- When generating classpaths for IDE sync
- When resolving configurations manually (`./gradlew dependencies`)
If a locked version can't be satisfied (e.g., due to a constraint or exclusion), lenient mode allows resolution to continue with the best available version.
# See locked dependencies for a project
cat dd-trace-api/gradle.lockfile
# View resolved dependencies (shows which versions are actually used)
./gradlew :dd-trace-api:dependencies --configuration runtimeClasspath

1. Developer adds a dependency: The dependency is resolved using the current version rules
2. Run with `--write-locks`: Gradle resolves all configurations and writes exact versions to lock files
3. Commit lock files: Include `gradle.lockfile` changes in your commit
4. CI validates: Builds use locked versions for reproducibility
5. Weekly CI job: Automatically updates all lock files to pick up new versions
Note
Instrumentation modules often use version ranges (e.g., `[3.0,3.12.12]`) to test against multiple library versions. Dependency locking pins these ranges to specific versions in lock files, ensuring consistent test behavior.
This project provides several custom Gradle extensions to manage multi-JVM testing, multi-version source sets, and CI optimizations.
Controls which JVM versions are allowed to run tests. Applied via the dd-trace-java.test-jvm-constraints plugin.
plugins {
id("dd-trace-java.test-jvm-constraints")
}
// project-wide constraints (apply to all Test tasks by default)
testJvmConstraints {
minJavaVersion = JavaVersion.VERSION_11
maxJavaVersion = JavaVersion.VERSION_21
excludeJdk.add("IBM8")
allowReflectiveAccessToJdk = true
}
// task-specific constraints (override project defaults for this task)
tasks.named<Test>("latestDepTest") {
testJvmConstraints {
minJavaVersion = JavaVersion.VERSION_17 // requires Java 17+ for this test suite
}
}

| Property | Description |
|---|---|
| `minJavaVersion` | Minimum JDK version allowed for tests |
| `maxJavaVersion` | Maximum JDK version allowed for tests |
| `forceJdk` | List of JDK names to force (overrides version checks) |
| `includeJdk` | JDK names to include |
| `excludeJdk` | JDK names to exclude |
| `allowReflectiveAccessToJdk` | Adds `--add-opens` flags for Java 16+ reflective access |
Running tests with a specific JVM:
./gradlew allTests -PtestJvm=zulu11
Manages multi-version Java source sets, allowing a single project to compile code targeting different JVM versions.
// In build.gradle.kts
apply(from = "$rootDir/gradle/java.gradle")
tracerJava {
addSourceSetFor(JavaVersion.VERSION_11) {
applyForTestSources = true // Apply version constraints to tests, default
}
}
This creates source sets for version-specific code:
src/
main/
java/ # Java 8 code (default)
java11/ # Java 11 specific code
test/
java/ # Compiled as Java 11
Distributes test execution across multiple CI jobs using hash-based project slotting.
# Run only projects assigned to slot 2 out of 4 parallel jobs
./gradlew test -Pslot=2/4
# Without slot parameter, all projects run
./gradlew test
Projects are assigned to slots based on a hash of their path, ensuring consistent distribution across builds.
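The idea behind hash-based slotting can be sketched as follows. This is a hypothetical illustration; the project's actual implementation lives in its build logic and may differ:

```kotlin
// Hypothetical sketch of slot filtering (not this project's actual code)
val slot = providers.gradleProperty("slot").orNull // e.g. "2/4"
if (slot != null) {
    val (index, total) = slot.split("/").map { it.toInt() }
    // A stable hash of the project path keeps assignments consistent across builds
    val assigned = Math.floorMod(project.path.hashCode(), total) + 1
    tasks.withType<Test>().configureEach {
        enabled = (assigned == index)
    }
}
```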
Optimizes CI by skipping tests for projects unaffected by git changes.
# Only run tests for projects with changes between master and HEAD
./gradlew baseTest -PgitBaseRef=master
# Specify both refs explicitly
./gradlew baseTest -PgitBaseRef=master -PgitNewRef=feature-branch
The system:
- Detects changed files via `git diff`
- Maps changes to affected projects
- Skips tests for unchanged projects
- Falls back to running all tests if "global effect" files change (e.g., `gradle/`, `build.gradle`)
The root build.gradle.kts defines aggregate tasks that orchestrate testing across subprojects with slot filtering and git change tracking.
| Task | Projects Included |
|---|---|
| `baseTest` / `baseLatestDepTest` / `baseCheck` | All projects except smoke, instrumentation, profiling, debugger |
| `instrumentationTest` / `instrumentationLatestDepTest` / `instrumentationCheck` | `:dd-java-agent:instrumentation` |
| `smokeTest` / `smokeLatestDepTest` / `smokeCheck` | `:dd-smoke-tests` |
| `profilingTest` / `profilingLatestDepTest` / `profilingCheck` | `:dd-java-agent:agent-profiling` |
| `debuggerTest` / `debuggerLatestDepTest` / `debuggerCheck` | `:dd-java-agent:agent-debugger` |
Combined usage:
# Run instrumentation tests in slot 1/4, only for changed projects, with coverage
./gradlew instrumentationTest -Pslot=1/4 -PgitBaseRef=main -PcheckCoverage

Note
These root tasks are defined using testAggregate() in build.gradle.kts. They combine CI slot filtering,
git change tracking, and optional JaCoCo coverage into convenient entry points for CI pipelines.
Convention plugins are the recommended way to share build logic across projects. They encapsulate common configuration patterns and can be applied like any other plugin.
Tip
Convention plugins promote consistency across modules. Instead of copy-pasting configuration, define it once and apply it everywhere.
Files ending in .gradle.kts placed in buildSrc/src/main/kotlin/ target Project and can configure tasks, dependencies, and extensions. The buildSrc/ directory is automatically included by Gradle before the main build.
In this project, convention plugins use the dd-trace-java. prefix. For example, dd-trace-java.configure-tests.gradle.kts configures test tasks across all subprojects:
// buildSrc/src/main/kotlin/dd-trace-java.configure-tests.gradle.kts (excerpt)
// Use lazy providers to avoid evaluating the property until it is needed
val skipTestsProvider = rootProject.providers.gradleProperty("skipTests")
val skipForkedTestsProvider = rootProject.providers.gradleProperty("skipForkedTests")
// Go through the Test tasks and configure them
tasks.withType<Test>().configureEach {
// Disable all tests if skipTests property was specified
onlyIf("skipTests are undefined or false") { !skipTestsProvider.isPresent }
// Set the test timeout to 20 minutes
timeout = Duration.of(20, ChronoUnit.MINUTES)
// ...
}

Apply it in any subproject:
// dd-java-agent/instrumentation/some-integration/build.gradle.kts
plugins {
id("dd-trace-java.configure-tests")
}

Other convention plugins in this project include:

- `dd-trace-java.gradle-debug` - Debugging utilities for build diagnostics
- `dd-trace-java.dependency-locking` - Dependency locking configuration
- `dd-trace-java.test-jvm-constraints` - JVM constraints for test execution
Files ending in .settings.gradle.kts target Settings and can configure repository declarations, plugin management, and build structure.
// buildSrc/src/main/kotlin/my-settings-convention.settings.gradle.kts
dependencyResolutionManagement {
repositories {
mavenCentral()
}
}

Warning
❌ Don't create new ones. Script plugins are deprecated. Gradle 9 documentation no longer mentions them as a recommended practice. They bring several issues.
Script plugins are standalone .gradle or .gradle.kts files that can be applied using the apply from: syntax. In this project, they are located in the gradle/ directory.
// Applying a script plugin
apply(from = "$rootDir/gradle/some-script.gradle")

As warned, don't write new ones; use convention plugins instead!
- No type safety: When written in Groovy DSL, you lose IDE support and compile-time checking
- Mixed DSL confusion: Projects often end up with a mix of Groovy and Kotlin scripts
- Poor discoverability: Applied scripts are harder to trace than plugin IDs
- No caching: Script plugins are re-evaluated on every build
There's an ongoing effort to migrate all of them to convention plugins for better maintainability and performance.
Each time Gradle is invoked, the build must go through the Configuration phase, in which Gradle evaluates the build scripts of all participating projects. Any expensive work in this phase runs on every single invocation.
This means inefficient configuration directly impacts developer experience by slowing down all builds — regardless of which tasks actually execute. While time savings per individual task may seem modest, they compound quickly: dd-trace-java has ~630 projects and ~33,000 tasks. At that scale, even small inefficiencies add up significantly.
The solution is Gradle's lazy API: make task creation and configuration as lazy as possible, so Gradle only realizes and configures objects it actually needs to execute.
When you use eager APIs, values are computed immediately during configuration—even if the task never runs. Lazy APIs defer this work to execution time, and Gradle can automatically track dependencies between producers and consumers.
| Eager (Don't ❌) | Lazy (Prefer ✅) | Notes |
|---|---|---|
| `configurations.getByName("x")` | `configurations.named("x")` | Returns a `NamedDomainObjectProvider` instead of resolving immediately |
| `tasks.getByName("x")` | `tasks.named("x")` | Avoids triggering task creation/configuration |
| `tasks.findByName("x")` | `tasks.named("x")` | Returns null if not found, but still realizes the task |
| `tasks.findByPath(":x")` | `tasks.named("x")` on target project | Realizes the task eagerly; use a project reference with `named()` instead |
| `tasks.create("x")` | `tasks.register("x")` | Task is only created when needed |
| `property.set(someValue)` | `property.set(provider { someValue })` | Defers computation of the value |
| `collection.all { }` | `collection.configureEach { }` | Configures lazily as elements are realized |
| `collection.forEach { }` | `collection.configureEach { }` | Avoids forcing realization of all elements |
| `file(path).exists()` | Use task inputs/outputs | Let Gradle track file dependencies |
| `exec { }.exitValue` | See Exec pattern below | Avoid running processes at configuration time |
Important
Any function that iterates over a Gradle collection (forEach, map, filter, all, any,
find, first, etc.) will eagerly realize all elements. This defeats lazy configuration.
Always use configureEach, or use named/withType to get lazy providers.
Warning
Groovy DSL pitfall: The shorthand syntax name { } is eager for both tasks and configurations. It calls getByName() under the hood, which realizes the element and its dependencies immediately.
// ❌ Eager - realizes the task immediately
compileLatestDepJava {
options.encoding = 'UTF-8'
}
// ✅ Lazy - configures only when needed
tasks.named('compileLatestDepJava') {
options.encoding = 'UTF-8'
}

// ❌ Eager - resolves the configuration immediately
runtimeClasspath {
exclude group: 'org.slf4j'
}
// ✅ Lazy - configures only when needed
configurations.named('runtimeClasspath') {
exclude group: 'org.slf4j'
}

// ❌ Eager - task is created immediately
tasks.create("processData") {
// configuration runs now, even if task is never executed
}
// ✅ Lazy - task is created only when needed
tasks.register("processData") {
// configuration runs only when this task is in the execution graph
}

// ❌ Eager - resolves configuration immediately
val runtimeClasspath = configurations.getByName("runtimeClasspath")
// ✅ Lazy - returns a provider
val runtimeClasspath = configurations.named("runtimeClasspath")

// ❌ Eager - forces all tasks to be created
tasks.all {
if (this is JavaCompile) {
options.encoding = "UTF-8"
}
}
// ❌ Eager - realizes and configures all JavaCompile tasks immediately
tasks.withType<JavaCompile>() {
options.encoding = "UTF-8"
}
// ✅ Lazy - configures each task as it's realized
tasks.withType<JavaCompile>().configureEach {
options.encoding = "UTF-8"
}

// ❌ Eager - version is read immediately
tasks.register<Jar>("myJar") {
archiveVersion.set(project.version.toString())
}
// ✅ Lazy - version is read when the jar task runs
tasks.register<Jar>("myJar") {
archiveVersion.set(project.provider { project.version.toString() })
}

Tip
In Gradle Kotlin DSL (and Groovy DSL) you can use = instead of set(...), e.g.
archiveVersion = project.version.toString()
A common pitfall is reading task outputs or other values during configuration. Here's how to pass values lazily to Test or JavaExec tasks using CommandLineArgumentProvider:
// ❌ Eager - task output is resolved at configuration time
tasks.withType(Test).configureEach {
def fooShadowJarTask = tasks.named('fooShadowJar', ShadowJar)
def barShadowJarTask = tasks.named('barShadowJarTask', ShadowJar)
dependsOn fooShadowJarTask, barShadowJarTask
// This resolves the archive path immediately during configuration!
systemProperty "smoketest.foo.path", fooShadowJarTask.get().archiveFile.get()
environment "BAR_PATH", barShadowJarTask.get().archiveFile.get()
}
// ✅ Lazy - use CommandLineArgumentProvider to defer resolution
tasks.withType(Test).configureEach {
def fooShadowJarTask = tasks.named('fooShadowJar', ShadowJar)
def barShadowJarTask = tasks.named('barShadowJarTask', ShadowJar)
dependsOn fooShadowJarTask, barShadowJarTask
jvmArgumentProviders.add(new CommandLineArgumentProvider() {
@Override
Iterable<String> asArguments() {
// This is only called at execution time
return fooShadowJarTask.map { ["-Dsmoketest.foo.path=${it.archiveFile.get()}"] }.get()
}
})
// Workaround: environment() calls toString() at execution time
environment("BAR_PATH", new Object() {
@Override
String toString() {
return barShadowJarTask.get().archiveFile.get().asFile.absolutePath
}
})
}

Tip
CommandLineArgumentProvider is the recommended way to pass lazily-computed JVM arguments.
It's configuration-cache compatible and properly tracks inputs for up-to-date checks. However,
for older APIs like environment() that don't accept providers, use the toString() wrapper
trick: pass an anonymous object whose toString() method computes the value—it will only be
called at execution time.
- Faster configuration: Only necessary work is performed
- Automatic dependency tracking: Gradle knows which tasks produce values that others consume
- Configuration cache compatibility: Lazy providers are serializable and can be cached
- Correct ordering: Task dependencies are inferred from provider relationships
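As an example of automatic dependency tracking, wiring a producer's output into a consumer through its task provider makes Gradle infer the task dependency (a Kotlin DSL sketch; the task names are made up):

```kotlin
// Producer: writes a version file under the build directory
val generateVersionFile = tasks.register("generateVersionFile") {
    val output = layout.buildDirectory.file("version.txt")
    val version = project.version.toString() // captured at configuration time
    outputs.file(output)
    doLast { output.get().asFile.writeText(version) }
}

// Consumer: mapping over the task provider keeps the value lazy AND
// implies dependsOn(generateVersionFile) - no explicit dependsOn needed.
tasks.register<Zip>("packageVersion") {
    from(generateVersionFile.map { it.outputs.files })
    destinationDirectory.set(layout.buildDirectory.dir("dist"))
    archiveFileName.set("version.zip")
}
```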
The Gradle Daemon is a long-lived background process that speeds up builds by avoiding JVM startup costs and caching project information. Configuring the Daemon JVM ensures consistent build behavior across machines.
You can specify criteria for the JVM that runs the Gradle Daemon. This is configured in gradle/gradle-daemon-jvm.properties:
# gradle/gradle-daemon-jvm.properties
toolchainVersion=21

When this file exists, Gradle will automatically provision a JVM matching the criteria using toolchain resolvers. This ensures all developers and CI systems use the same JVM version to run the build, regardless of their local JAVA_HOME.
Note
The Daemon JVM is separate from the toolchain used to compile your code. The Daemon JVM runs Gradle itself, while compilation toolchains (configured via java.toolchain) compile your source files.
To change the Daemon JVM version, use the built-in updateDaemonJvm task:
# Update to a specific JVM version
./gradlew updateDaemonJvm --jvm-version=21

This task updates the gradle/gradle-daemon-jvm.properties file with the new criteria. Commit this file to version control so the entire team uses the same Daemon JVM.
Caution
Not using this task will break the JDK auto-provisioning for the Gradle Daemon.
When Gradle builds fail or behave unexpectedly, several tools and techniques can help diagnose the problem.
Tip
In general, following the Gradle best practices is a safe bet to avoid issues.
This project uses Gradle Develocity for build scans. Build scans provide detailed insights into build performance, dependency resolution, and failures.
# Generate a build scan (requires accepting terms of use)
./gradlew build --scan

On CI, build scans are automatically published (unless SKIP_BUILDSCAN=true). The scan URL is printed at the end of the build output.
What build scans show:
- Build timeline and task execution order
- Dependency resolution details and conflicts
- Test results with failure details
- Configuration cache hits/misses
- Build cache effectiveness
For performance analysis, focus on these sections:
| Section | What to look for |
|---|---|
| Performance → Build | Total build time breakdown: configuration vs execution vs overhead |
| Performance → Configuration | Slow scripts, plugin apply times, expensive configuration logic |
| Performance → Task execution | Parallelism utilization, task wait times, serial bottlenecks |
| Timeline | Visual task execution, look for long sequential chains |
| Timeline → Critical path | Tasks that directly impact total build time—optimize these first |
| Build cache | Cache hit rates, "not cacheable" tasks that could be |
| Dependency resolution | Slow repositories, resolution time per configuration |
Tip
The critical path shows tasks that directly determine build duration. Parallelizing or speeding up tasks not on the critical path won't reduce total build time.
Access it from the Timeline section: click the search icon 🔍 at the top-left, then enable the On critical path toggle to focus on these tasks.
Now, inspect the outcome of each such task, in particular:

- whether the task is not `UP-TO-DATE`,
- whether the task is not cacheable, for reasons like overlapping outputs, caching not being enabled, or "more information is needed to cache this task".
Diagnosing tasks that should be UP-TO-DATE but aren't:
- In build scans: Navigate to Timeline → click on the task → Inputs tab. Look for inputs that change unexpectedly between builds (timestamps, absolute paths, non-deterministic values). The Outcome section also shows the reason for execution (e.g., "Input property 'source' has changed").
- With the `--info` flag: Gradle logs why each task executed (same info as build scans but locally):

  ./gradlew :my-project:compileJava --info
  # Look for: "Task ':my-project:compileJava' is not up-to-date because:"
  # - "Input property 'source' file /path/to/File.java has changed."
  # - "Output property 'destinationDirectory' file /path/to/classes has been removed."

- Common causes of unexpected re-execution:
  - Undeclared inputs (task reads files not tracked as inputs)
  - Non-reproducible inputs (timestamps, random values, absolute paths)
  - Outputs modified by another task or external process
  - Missing `@PathSensitive` annotation causing full path comparison instead of relative
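As a sketch of the last point, a custom task class with relative path sensitivity declared on its inputs might look like this (the class and property names are made up):

```kotlin
import org.gradle.api.DefaultTask
import org.gradle.api.file.ConfigurableFileCollection
import org.gradle.api.file.RegularFileProperty
import org.gradle.api.tasks.*

abstract class ListSourcesTask : DefaultTask() {
    // RELATIVE sensitivity: only relative paths and contents are compared,
    // so building from a different checkout location stays UP-TO-DATE.
    @get:InputFiles
    @get:PathSensitive(PathSensitivity.RELATIVE)
    abstract val sources: ConfigurableFileCollection

    @get:OutputFile
    abstract val report: RegularFileProperty

    @TaskAction
    fun listSources() {
        report.get().asFile.writeText(sources.files.joinToString("\n") { it.name })
    }
}
```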
Gradle provides several flags to increase output verbosity:
| Flag | Description |
|---|---|
| `--info` | Adds informational log messages (recommended starting point) |
| `--debug` | Very verbose output including internal Gradle operations |
| `--stacktrace` | Prints stack traces for exceptions |
| `--full-stacktrace` | Prints full stack traces (including internal frames) |
| `--scan` | Generates a build scan with detailed diagnostics |
| `--dry-run` / `-m` | Shows which tasks would run without executing them |
| `--console=verbose` | Shows all task outcomes including UP-TO-DATE |
# Diagnose a failing task with info logging
./gradlew :my-project:test --info
# See full exception details
./gradlew build --stacktrace
# Combine for thorough diagnosis
./gradlew build --info --stacktrace --scan

This project includes a debug plugin that logs JDK information for all tasks. Enable it with -PddGradleDebug:
./gradlew build -PddGradleDebug

This writes JSON-formatted task/JDK mappings to build/datadog.gradle-debug.log:
{"task":":dd-trace-api:compileJava","jdk":"8"}
{"task":":dd-trace-api:test","jdk":"11"}

Use this to diagnose JDK-related issues, especially when tasks use unexpected Java versions.
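Since the log is one small JSON object per line, a full parser isn't needed to slice it; for example (a hypothetical helper, not part of the project):

```kotlin
// Extract the task paths that ran on a given JDK from the debug log lines.
// Naive regex matching is enough for this one-object-per-line format.
fun tasksOnJdk(logLines: List<String>, jdk: String): List<String> =
    logLines
        .filter { it.contains("\"jdk\":\"$jdk\"") }
        .mapNotNull { Regex("\"task\":\"([^\"]+)\"").find(it)?.groupValues?.get(1) }
```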
"Could not resolve dependency"
# Check dependency resolution details
./gradlew :my-project:dependencies --configuration runtimeClasspath
# Force refresh of dependencies
./gradlew build --refresh-dependencies

"Task ... uses this output of task ... without declaring an explicit or implicit dependency"
This indicates a missing task dependency. The producing task must be declared as a dependency:
tasks.named("consumingTask") {
// Declare the task inputs.
inputs.files(producedPathProvider) // producedPathProvider is declared earlier in the build
// This works as well, but it requires the task to be already registered.
inputs.files(tasks.named("producingTask").map { it.outputs.files })
// Avoid this if possible, but if the above doesn't work, declare the task dependency explicitly
dependsOn(tasks.named("producingTask"))
}

Out of memory errors
Increase Gradle daemon memory in gradle.properties:
org.gradle.jvmargs=-Xmx4g -XX:+HeapDumpOnOutOfMemoryError

Daemon seems stuck or slow
# Stop all daemons and start fresh
./gradlew --stop
./gradlew build

The configuration cache speeds up builds by caching the task graph. However, some code patterns are incompatible.
Common configuration cache violations:
-
Capturing
Projectat execution time:// ❌ Bad - captures Project reference tasks.register("myTask") { doLast { println(project.version) // Not allowed! } } // ✅ Good - capture value at configuration time tasks.register("myTask") { val version = project.version doLast { println(version) } }
-
Using
Task.projectin task actions:// ❌ Bad tasks.register("myTask") { doLast { copy { from(project.file("src")) // Not allowed! } } } // ✅ Good - use task properties tasks.register<Copy>("myTask") { from("src") }
Diagnosing configuration cache problems:
# Run with configuration cache and see what fails
./gradlew build --configuration-cache
# Generate a detailed report
./gradlew build --configuration-cache --configuration-cache-problems=warn

The report shows exactly which code paths capture disallowed references.
# Show project structure
./gradlew projects
# List all tasks in a project
./gradlew :my-project:tasks --all
# Show build script dependencies (the plugins on the build classpath)
./gradlew :my-project:buildEnvironment
# Show all resolvable configurations
./gradlew :my-project:resolvableConfigurations
# Show outgoing variants (what this project exposes)
./gradlew :my-project:outgoingVariants
# Validate build logic without running tasks
./gradlew help --scan