1 change: 0 additions & 1 deletion .vale/styles/Infrahub/branded-terms-case-swap.yml
@@ -13,7 +13,6 @@ swap:
(?:[Gg]itlab): GitLab
(?:gitpod): GitPod
(?:grafana): Grafana
(?:[^/][Gg]raphql): GraphQL
(?:[Ii]nflux[Dd]b): InfluxDB
infrahub(?:\s|$): Infrahub
(?:jinja2): Jinja2
1 change: 1 addition & 0 deletions changelog/+gql-command.added.md
@@ -0,0 +1 @@
Add `infrahubctl graphql` commands to export schema and generate Pydantic types from GraphQL queries
56 changes: 56 additions & 0 deletions docs/docs/infrahubctl/infrahubctl-graphql.mdx
@@ -0,0 +1,56 @@
# `infrahubctl graphql`

Various GraphQL related commands.

**Usage**:

```console
$ infrahubctl graphql [OPTIONS] COMMAND [ARGS]...
```

**Options**:

* `--install-completion`: Install completion for the current shell.
* `--show-completion`: Show completion for the current shell, to copy it or customize the installation.
* `--help`: Show this message and exit.

**Commands**:

* `export-schema`: Export the GraphQL schema to a file.
* `generate-return-types`: Create Pydantic Models for GraphQL query...

## `infrahubctl graphql export-schema`

Export the GraphQL schema to a file.

**Usage**:

```console
$ infrahubctl graphql export-schema [OPTIONS]
```

**Options**:

* `--destination PATH`: Path to the GraphQL schema file. [default: schema.graphql]
* `--config-file TEXT`: [env var: INFRAHUBCTL_CONFIG; default: infrahubctl.toml]
* `--help`: Show this message and exit.

## `infrahubctl graphql generate-return-types`

Create Pydantic Models for GraphQL query return types

**Usage**:

```console
$ infrahubctl graphql generate-return-types [OPTIONS] [QUERY]
```

**Arguments**:

* `[QUERY]`: Location of the GraphQL query file(s).

**Options**:

* `--schema PATH`: Path to the GraphQL schema file. [default: schema.graphql]
* `--config-file TEXT`: [env var: INFRAHUBCTL_CONFIG; default: infrahubctl.toml]
* `--help`: Show this message and exit.
48 changes: 48 additions & 0 deletions docs/docs/python-sdk/guides/python-typing.mdx
@@ -101,3 +101,51 @@ my_object = client.get(MyOwnObject, name__value="example")
```

> if you don't have your own Python module, it's possible to use relative path by having the `procotols.py` in the same directory as your script/transform/generator


⚠️ Potential issue | 🟡 Minor

Minor typo in comment above this section.

Line 103 contains a typo: "procotols.py" should be "protocols.py".

Apply this diff to fix the typo:

-> if you don't have your own Python module, it's possible to use relative path by having the `procotols.py` in the same directory as your script/transform/generator
+> if you don't have your own Python module, it's possible to use relative path by having the `protocols.py` in the same directory as your script/transform/generator

## Generating Pydantic models from GraphQL queries
Contributor: As you're working with this file can you also fix the typo with procotols.py just above that CodeRabbit is complaining about?


When working with GraphQL queries, you can generate type-safe Pydantic models that correspond to your query return types. This provides excellent type safety and IDE support for your GraphQL operations.

### Why use generated return types?

Generated Pydantic models from GraphQL queries offer several important benefits:

- **Type Safety**: Catch type errors at development time instead of runtime
Contributor: Suggested change:

- Catch type errors at development time instead of runtime
+ Catch type errors during development time instead of at runtime

- **IDE Support**: Get autocomplete, type hints, and better IntelliSense in your IDE
- **Documentation**: Generated models serve as living documentation of your GraphQL API
- **Validation**: Automatic validation of query responses against the expected schema

### Generating return types

Use the `infrahubctl graphql generate-return-types` command to create Pydantic models from your GraphQL queries:

```shell
# Generate models for queries in current directory
infrahubctl graphql generate-return-types

# Generate models for specific query files
infrahubctl graphql generate-return-types queries/get_devices.gql
```

> You can also export the GraphQL schema first using the `infrahubctl graphql export-schema` command.

### Example workflow

1. **Create your GraphQL queries** in `.gql` files:
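
   For example, a hypothetical `queries/get_devices.gql` might look like this (query and field names are illustrative; the actual fields depend on your schema):

   ```graphql
   # queries/get_devices.gql (hypothetical example)
   query GetDevices {
     InfraDevice {
       edges {
         node {
           id
           name {
             value
           }
         }
       }
     }
   }
   ```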

2. **Generate the Pydantic models**:

```shell
infrahubctl graphql generate-return-types queries/
```

The command generates one Python file per query, named after the query.

3. **Use the generated models** in your Python code:

```python
from .queries.get_devices import GetDevicesQuery

# MY_QUERY contains the contents of queries/get_devices.gql
response = await client.execute_graphql(query=MY_QUERY)
data = GetDevicesQuery(**response)
```
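For illustration, a generated model for a hypothetical `GetDevices` query might have a shape along these lines; the actual class and field names depend on your schema and on ariadne-codegen's naming:

```python
from typing import List

from pydantic import BaseModel


# Hypothetical generated shapes, not actual SDK output.
class GetDevicesNode(BaseModel):
    id: str
    name: str


class GetDevicesQuery(BaseModel):
    # The field name mirrors the top-level key of the GraphQL response.
    devices: List[GetDevicesNode]


# Validating a raw GraphQL response dict against the model:
response = {"devices": [{"id": "42", "name": "atl1-edge1"}]}
data = GetDevicesQuery(**response)
```

Invalid responses (wrong types, missing keys) raise a `ValidationError` instead of failing silently later.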
2 changes: 2 additions & 0 deletions infrahub_sdk/ctl/cli_commands.py
@@ -26,6 +26,7 @@
from ..ctl.client import initialize_client, initialize_client_sync
from ..ctl.exceptions import QueryNotFoundError
from ..ctl.generator import run as run_generator
from ..ctl.graphql import app as graphql_app
from ..ctl.menu import app as menu_app
from ..ctl.object import app as object_app
from ..ctl.render import list_jinja2_transforms, print_template_errors
@@ -63,6 +64,7 @@
app.add_typer(repository_app, name="repository")
app.add_typer(menu_app, name="menu")
app.add_typer(object_app, name="object")
app.add_typer(graphql_app, name="graphql")

app.command(name="dump")(dump)
app.command(name="load")(load)
174 changes: 174 additions & 0 deletions infrahub_sdk/ctl/graphql.py
@@ -0,0 +1,174 @@
from __future__ import annotations

import ast
from collections import defaultdict
from pathlib import Path
from typing import Optional

import typer
from ariadne_codegen.client_generators.package import PackageGenerator, get_package_generator
from ariadne_codegen.exceptions import ParsingError
from ariadne_codegen.plugins.explorer import get_plugins_types
from ariadne_codegen.plugins.manager import PluginManager
from ariadne_codegen.schema import (
    filter_fragments_definitions,
    filter_operations_definitions,
    get_graphql_schema_from_path,
)
from ariadne_codegen.settings import ClientSettings, CommentsStrategy
from ariadne_codegen.utils import ast_to_str
from graphql import DefinitionNode, GraphQLSchema, NoUnusedFragmentsRule, parse, specified_rules, validate
from rich.console import Console

from ..async_typer import AsyncTyper
from ..ctl.client import initialize_client
from ..ctl.utils import catch_exception
from ..graphql.utils import insert_fragments_inline, remove_fragment_import
from .parameters import CONFIG_PARAM

app = AsyncTyper()
console = Console()

ARIADNE_PLUGINS = [
    "infrahub_sdk.graphql.plugin.PydanticBaseModelPlugin",
    "infrahub_sdk.graphql.plugin.FutureAnnotationPlugin",
    "infrahub_sdk.graphql.plugin.StandardTypeHintPlugin",
]


def find_gql_files(query_path: Path) -> list[Path]:
    """
    Find all files with .gql extension in the specified directory.

    Args:
        query_path: Path to the directory to search for .gql files

    Returns:
        List of Path objects for all .gql files found
    """
    if not query_path.exists():
        raise FileNotFoundError(f"Directory not found: {query_path}")
Contributor: This could look a bit weird if query_path was a file instead of a directory. I.e. that we'd say "Directory not found" and point to a file could be confusing.


    if not query_path.is_dir() and query_path.is_file():
        return [query_path]

    return list(query_path.glob("**/*.gql"))
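The behavior above — returning a single file as-is when given a file path, otherwise recursively globbing for `.gql` files — can be exercised with a stdlib-only sketch (the helper is reproduced here for illustration):

```python
import tempfile
from pathlib import Path


def find_gql_files(query_path: Path) -> list[Path]:
    # Mirrors the helper above: a file is returned as-is,
    # a directory is searched recursively for .gql files.
    if not query_path.exists():
        raise FileNotFoundError(f"Directory not found: {query_path}")
    if not query_path.is_dir() and query_path.is_file():
        return [query_path]
    return list(query_path.glob("**/*.gql"))


with tempfile.TemporaryDirectory() as tmp:
    root = Path(tmp)
    (root / "sub").mkdir()
    (root / "a.gql").write_text("query A { ok }")
    (root / "sub" / "b.gql").write_text("query B { ok }")
    (root / "notes.txt").write_text("ignored")

    found = sorted(p.name for p in find_gql_files(root))      # recursive search
    single = find_gql_files(root / "a.gql")                   # single file passthrough
```

Non-`.gql` files are ignored during directory search, but note that a single-file argument is returned regardless of its extension.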


def get_graphql_query(queries_path: Path, schema: GraphQLSchema) -> tuple[DefinitionNode, ...]:
    """Get graphql queries definitions from a single GraphQL file."""

    if not queries_path.exists():
        raise FileNotFoundError(f"File not found: {queries_path}")
    if not queries_path.is_file():
        raise ValueError(f"{queries_path} is not a file")

    queries_str = queries_path.read_text(encoding="utf-8")
    queries_ast = parse(queries_str)
    validation_errors = validate(
        schema=schema,
        document_ast=queries_ast,
        rules=[r for r in specified_rules if r is not NoUnusedFragmentsRule],
    )
    if validation_errors:
        raise ValueError("\n\n".join(error.message for error in validation_errors))
    return queries_ast.definitions
Contributor: While we don't support it yet within the SDK this would be problematic with fragments spread out in multiple files. Not an issue for this PR, just highlighting if we want to do something different with regards to the API of the command because of this.



def generate_result_types(directory: Path, package: PackageGenerator, fragment: ast.Module) -> None:
    for file_name, module in package._result_types_files.items():
        file_path = directory / file_name

        insert_fragments_inline(module, fragment)
        remove_fragment_import(module)

        code = package._add_comments_to_code(ast_to_str(module), package.queries_source)
        if package.plugin_manager:
            code = package.plugin_manager.generate_result_types_code(code)
        file_path.write_text(code)
        package._generated_files.append(file_path.name)


@app.callback()
def callback() -> None:
    """
    Various GraphQL related commands.
    """


@app.command()
@catch_exception(console=console)
async def export_schema(
    destination: Path = typer.Option("schema.graphql", help="Path to the GraphQL schema file."),
    _: str = CONFIG_PARAM,
) -> None:
    """Export the GraphQL schema to a file."""

    client = initialize_client()
    response = await client._get(url=f"{client.address}/schema.graphql")
Contributor: I'd suggest that we move this and avoid calling client._get from the CTL. Instead I'd suggest that we add some client.schema.get_graphql_schema()

Do we need a branch parameter here?

    destination.write_text(response.text)
    console.print(f"[green]Schema exported to {destination}")
Comment on lines +99 to +110

⚠️ Potential issue | 🟠 Major

Validate HTTP response before writing schema file.

Line 107-108 fetches the schema and writes response.text directly to the file without checking the HTTP status code. If the server returns an error (e.g., 404, 500), the error response body will be written to the schema file, resulting in an invalid schema that will cause downstream failures.

Apply this diff to validate the response:

     client = initialize_client()
     response = await client._get(url=f"{client.address}/schema.graphql")
+    if response.status_code != 200:
+        raise ValueError(f"Failed to fetch schema: HTTP {response.status_code} - {response.text}")
+    destination.parent.mkdir(parents=True, exist_ok=True)
     destination.write_text(response.text)
     console.print(f"[green]Schema exported to {destination}")

Note on private method access:

Line 107 calls client._get(), which is a private method. While this may be necessary if the public API doesn't expose a suitable method, consider documenting this dependency or requesting a public method from the InfrahubClient maintainers.




@app.command()
@catch_exception(console=console)
async def generate_return_types(
    query: Optional[Path] = typer.Argument(None, help="Location of the GraphQL query file(s)."),
    schema: Path = typer.Option("schema.graphql", help="Path to the GraphQL schema file."),
    _: str = CONFIG_PARAM,
) -> None:
    """Create Pydantic Models for GraphQL query return types"""

    query = Path.cwd() if query is None else query

    # Load the GraphQL schema
    if not schema.exists():
        raise FileNotFoundError(f"GraphQL Schema file not found: {schema}")
    graphql_schema = get_graphql_schema_from_path(schema_path=str(schema))

    # Initialize the plugin manager
    plugin_manager = PluginManager(
        schema=graphql_schema,
        plugins_types=get_plugins_types(plugins_strs=ARIADNE_PLUGINS),
    )

    # Find the GraphQL files and organize them by directory
    gql_files = find_gql_files(query)
    gql_per_directory: dict[Path, list[Path]] = defaultdict(list)
    for gql_file in gql_files:
        gql_per_directory[gql_file.parent].append(gql_file)

    # Generate the Pydantic Models for the GraphQL queries
    for directory, gql_files in gql_per_directory.items():
        for gql_file in gql_files:
            try:
                definitions = get_graphql_query(queries_path=gql_file, schema=graphql_schema)
            except ValueError as exc:
                print(f"Error generating result types for {gql_file}: {exc}")
Contributor: We should use the console.print method to show output to users.

                continue
            queries = filter_operations_definitions(definitions)
            fragments = filter_fragments_definitions(definitions)

            package_generator = get_package_generator(
                schema=graphql_schema,
                fragments=fragments,
                settings=ClientSettings(
                    schema_path=str(schema),
                    target_package_name=directory.name,
                    queries_path=str(directory),
                    include_comments=CommentsStrategy.NONE,
                ),
                plugin_manager=plugin_manager,
            )

            try:
                for query_operation in queries:
                    package_generator.add_operation(query_operation)
            except ParsingError as exc:
                console.print(f"[red]Unable to process {gql_file.name}: {exc}")
Contributor: It looks like we ignore this error and the one above aside from displaying output. Do we want the command to fail here?

            module_fragment = package_generator.fragments_generator.generate()

            generate_result_types(directory=directory, package=package_generator, fragment=module_fragment)

            for file_name in package_generator._result_types_files.keys():
                console.print(f"[green]Generated {file_name} in {directory}")
Comment on lines +142 to +174

⚠️ Potential issue | 🟠 Major

Inconsistent error logging and potential issue with partial generation.

Two issues in the generation loop:

  1. Inconsistent logging: Line 144 uses print() while lines 165 and 171 use console.print(). This creates inconsistent output formatting.

  2. Partial generation after parsing error: Lines 161-165 catch ParsingError from add_operation and continue, but lines 166-168 proceed to generate types using the potentially incomplete package_generator. If some operations failed to parse, the generated types may be incomplete or incorrect, yet files are still written and success messages are printed (lines 170-171).

Apply this diff to fix both issues:

             except ValueError as exc:
-                print(f"Error generating result types for {gql_file}: {exc}")
+                console.print(f"[red]Error generating result types for {gql_file}: {exc}")
                 continue
             queries = filter_operations_definitions(definitions)
             fragments = filter_fragments_definitions(definitions)
 
             package_generator = get_package_generator(
                 schema=graphql_schema,
                 fragments=fragments,
                 settings=ClientSettings(
                     schema_path=str(schema),
                     target_package_name=directory.name,
                     queries_path=str(directory),
                     include_comments=CommentsStrategy.NONE,
                 ),
                 plugin_manager=plugin_manager,
             )
 
+            parsing_failed = False
             try:
                 for query_operation in queries:
                     package_generator.add_operation(query_operation)
             except ParsingError as exc:
                 console.print(f"[red]Unable to process {gql_file.name}: {exc}")
+                parsing_failed = True
+            
+            if parsing_failed:
+                continue
+            
             module_fragment = package_generator.fragments_generator.generate()
 
             generate_result_types(directory=directory, package=package_generator, fragment=module_fragment)
 
             for file_name in package_generator._result_types_files.keys():
                 console.print(f"[green]Generated {file_name} in {directory}")

12 changes: 12 additions & 0 deletions infrahub_sdk/graphql/__init__.py
@@ -0,0 +1,12 @@
from .constants import VARIABLE_TYPE_MAPPING
from .query import Mutation, Query
from .renderers import render_input_block, render_query_block, render_variables_to_string

__all__ = [
    "VARIABLE_TYPE_MAPPING",
    "Mutation",
    "Query",
    "render_input_block",
    "render_query_block",
    "render_variables_to_string",
]
1 change: 1 addition & 0 deletions infrahub_sdk/graphql/constants.py
@@ -0,0 +1 @@
VARIABLE_TYPE_MAPPING = ((str, "String!"), (int, "Int!"), (float, "Float!"), (bool, "Boolean!"))
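
This tuple maps Python types to non-nullable GraphQL scalar types. A hypothetical lookup helper (not part of the SDK) shows how such a mapping could be consumed; note that `bool` must be handled before `int`, since `bool` is a subclass of `int` in Python and would otherwise match `Int!`:

```python
VARIABLE_TYPE_MAPPING = ((str, "String!"), (int, "Int!"), (float, "Float!"), (bool, "Boolean!"))


def graphql_variable_type(value: object) -> str:
    # Hypothetical helper, not part of the SDK.
    # bool is a subclass of int, so check it first to avoid
    # True/False being reported as Int!.
    if isinstance(value, bool):
        return "Boolean!"
    for python_type, graphql_type in VARIABLE_TYPE_MAPPING:
        if isinstance(value, python_type):
            return graphql_type
    raise TypeError(f"No GraphQL scalar mapping for {type(value).__name__}")


t_str = graphql_variable_type("edge1")
t_bool = graphql_variable_type(True)
t_float = graphql_variable_type(3.14)
```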