diff --git a/.pydoc-markdown.yml b/.pydoc-markdown.yml
new file mode 100644
index 0000000..6a2c8bf
--- /dev/null
+++ b/.pydoc-markdown.yml
@@ -0,0 +1,8 @@
+ loaders:
+ - type: python
+ search_path: [lkml2cube]
+ renderer:
+ type: markdown
+ render_toc: true
+ render_module_header: true
+
\ No newline at end of file
diff --git a/CLAUDE.md b/CLAUDE.md
index ffc7866..41e599c 100644
--- a/CLAUDE.md
+++ b/CLAUDE.md
@@ -2,10 +2,31 @@
This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.
+> **🚨 IMPORTANT**: Always check `docs/lkml2cube_*.md` files BEFORE reading source code. Update docstrings and run `python scripts/generate_docs.py` after any code changes.
+
## Project Overview
lkml2cube is a Python CLI tool that converts LookML models into Cube data models. It uses the `lkml` library to parse LookML files and generates YAML-based Cube definitions.
+## 📚 Documentation-First Development
+
+**CRITICAL**: Before reading source code, always consult the generated documentation:
+
+1. **Primary Reference**: Check `docs/lkml2cube_*.md` files first
+2. **Implementation Details**: Read source code only if docs are insufficient
+3. **Always Update**: Maintain docstrings and regenerate docs for any changes
+
+📖 **Documentation Files**:
+- `docs/lkml2cube_main.md` - CLI commands and usage
+- `docs/lkml2cube_converter.md` - LookMLConverter Python API
+- `docs/lkml2cube_parser_cube_api.md` - Module to interact with Cube meta API
+- `docs/lkml2cube_parser_loader.md` - File loading and writing utilities
+- `docs/lkml2cube_parser_explores.md` - Module to convert LookML explores to Cube `view` definitions
+- `docs/lkml2cube_parser_types.md` - Custom YAML types for proper formatting
+- `docs/lkml2cube_parser_views.md` - Module to convert LookML views to Cube `cube` definitions
+
+⚠️ **Required Workflow**: When modifying code → Update docstrings → Run `python scripts/generate_docs.py`
+
## Development Commands
### Environment Setup
@@ -15,9 +36,17 @@ lkml2cube is a Python CLI tool that converts LookML models into Cube data models
### Testing
- Tests are located in `tests/` directory
-- Main test file: `tests/test_e2e.py`
+- Main test files: `tests/test_e2e.py`, `tests/test_converter.py`, `tests/test_explores_command.py`
- Test samples are in `tests/samples/` with both `lkml/` and `cubeml/` subdirectories
- Tests compare generated output against expected YAML files
+- `test_converter.py` provides comprehensive unit tests for the `LookMLConverter` class
+
+### Documentation Generation
+- **Generate docs**: `python scripts/generate_docs.py`
+- **MANDATORY**: Run after any function/method changes
+- **Output**: Updates `docs/lkml2cube_*.md` files
+- **Fallback**: Uses manual generation if pydoc-markdown fails
+- **Optimization**: Output is optimized for LLM consumption
### CLI Usage
The tool provides three main commands:
@@ -36,8 +65,9 @@ The tool provides three main commands:
- `cube_api.py` - Interfaces with Cube meta API, correctly separates cubes vs views
- `types.py` - Custom YAML types for proper formatting
-#### Main Entry Point
+#### Main Entry Points
- `main.py` - Typer-based CLI with three commands: cubes, views, explores
+- `converter.py` - Python API class `LookMLConverter` for programmatic usage
- Uses Rich for console output formatting
### Key Concepts
@@ -67,4 +97,193 @@ Common options across commands:
- `--parseonly` - Shows parsed LookML as Python dict
- `--printonly` - Prints generated YAML to stdout
- `--outputdir` - Directory for output files
-- `--rootdir` - Base path for resolving includes
\ No newline at end of file
+- `--rootdir` - Base path for resolving includes
+
+## Python API Usage
+
+### LookMLConverter Class
+The `LookMLConverter` class provides a Python API for programmatic usage without requiring CLI interaction:
+
+```python
+from lkml2cube.converter import LookMLConverter
+
+# Initialize with configuration
+converter = LookMLConverter(
+ outputdir="/tmp/output",
+ rootdir="/lookml/models",
+ parseonly=False,
+ printonly=False,
+ use_explores_name=False
+)
+
+# Convert LookML views to Cube definitions
+result = converter.cubes("models/orders.lkml")
+
+# Convert LookML explores to Cube definitions with views
+result = converter.views("models/explores.lkml")
+
+# Generate LookML explores from Cube meta API
+result = converter.explores("https://api.cube.dev/v1/meta", "jwt-token")
+```
+
+#### Key Methods
+- `cubes(file_path)` - Equivalent to `lkml2cube cubes` command
+- `views(file_path)` - Equivalent to `lkml2cube views` command
+- `explores(metaurl, token)` - Equivalent to `lkml2cube explores` command
+- `set_config(**kwargs)` - Update configuration options
+- `get_config()` - Get current configuration
+- `validate_files(file_paths)` - Validate that files can be loaded
+
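+As a quick sketch, `validate_files` can screen inputs before a conversion run (the file paths below are illustrative):
+
+```python
+converter = LookMLConverter()
+results = converter.validate_files(["models/orders.lkml", "models/missing.lkml"])
+valid = [path for path, ok in results.items() if ok]
+```
+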
+#### Configuration Management
+The converter maintains state and can be reconfigured:
+
+```python
+# Update configuration
+converter.set_config(parseonly=True, outputdir="/new/path")
+
+# Get current configuration
+config = converter.get_config()
+```
+
+#### Return Values
+All methods return a dictionary with relevant data:
+- `parseonly=True`: Returns parsed model structure
+- `printonly=True`: Returns YAML output string
+- Default: Returns file generation summary
+
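+The default summary maps generated files to their paths, for example (paths illustrative):
+
+```python
+{
+    'cubes': [{'name': 'orders', 'path': '/output/cubes/orders.yml'}],
+    'views': [{'name': 'orders_view', 'path': '/output/views/orders_view.yml'}]
+}
+```
+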
+## Documentation and Code Guidelines
+
+### Documentation-First Approach
+
+**IMPORTANT**: Always refer to the generated documentation in the `docs/` directory before reading source code:
+
+1. **First**: Check `docs/lkml2cube_*.md` files for API documentation
+2. **Second**: If implementation details are needed, then read the source code
+3. **Always**: Maintain and update documentation when making changes
+
+### Documentation Files
+
+The project maintains auto-generated documentation:
+- `docs/lkml2cube_main.md` - CLI commands and usage
+- `docs/lkml2cube_converter.md` - LookMLConverter Python API
+- `docs/lkml2cube_parser_cube_api.md` - Module to interact with Cube meta API
+- `docs/lkml2cube_parser_loader.md` - File loading and writing utilities
+- `docs/lkml2cube_parser_explores.md` - Module to convert LookML explores to Cube `view` definitions
+- `docs/lkml2cube_parser_types.md` - Custom YAML types for proper formatting
+- `docs/lkml2cube_parser_views.md` - Module to convert LookML views to Cube `cube` definitions
+
+### Documentation Maintenance Workflow
+
+**MANDATORY**: When adding new functions or modifying existing ones:
+
+1. **Add/Update Google-style Docstrings**:
+ ```python
+ def my_function(param1: str, param2: int = 5) -> dict:
+ """Brief one-line description.
+
+ Detailed description if needed.
+
+ Args:
+ param1 (str): Description of param1.
+ param2 (int, optional): Description of param2. Defaults to 5.
+
+ Returns:
+ dict: Description of return value.
+
+ Raises:
+ ValueError: Description of when this is raised.
+
+ Example:
+ >>> result = my_function("test", 10)
+ >>> result['key']
+ 'value'
+ """
+ ```
+
+2. **Run Documentation Generation**:
+ ```bash
+ python scripts/generate_docs.py
+ ```
+
+3. **Verify Documentation**:
+ - Check that `docs/` files are updated
+ - Ensure docstrings are properly formatted
+ - Verify examples work correctly
+
+### Google-style Docstring Requirements
+
+All public functions, methods, and classes MUST have Google-style docstrings including:
+- **Clear one-line description**
+- **Complete parameter documentation with types**
+- **Return value descriptions**
+- **Raised exceptions**
+- **Simple usage examples**
+
+### Documentation Generation Script
+
+The `scripts/generate_docs.py` script:
+- Automatically extracts docstrings from source code
+- Generates markdown files in `docs/` directory
+- Uses pydoc-markdown with manual fallback
+- Optimizes output for LLM consumption
+- Must be run after any function signature changes
+
+### When to Update Documentation
+
+Run `python scripts/generate_docs.py` when:
+- Adding new functions or methods
+- Modifying function signatures
+- Changing parameter types or defaults
+- Adding or removing classes
+- Updating docstrings for clarity
+
+### Code Review Checklist
+
+Before committing changes:
+- [ ] All new functions have Google-style docstrings
+- [ ] Documentation generation script has been run
+- [ ] Generated docs reflect the changes
+- [ ] Examples in docstrings are accurate
+- [ ] Parameter types and descriptions are correct
+
+## 🔒 Enforcement Rules
+
+**MANDATORY REQUIREMENTS**:
+
+1. **Documentation First**: NEVER read source code without first checking the generated documentation in `docs/`
+2. **Google-style Docstrings**: ALL public functions, methods, and classes MUST have complete Google-style docstrings
+3. **Documentation Generation**: ALWAYS run `python scripts/generate_docs.py` after any code changes
+4. **No Exceptions**: These rules apply to ALL code changes, no matter how small
+
+**VIOLATION CONSEQUENCES**:
+- Code changes without proper docstrings will be rejected
+- Failure to generate documentation will result in incomplete assistance
+- Not following documentation-first approach will lead to suboptimal code understanding
+
+**COMPLIANCE VERIFICATION**:
+- Check that `docs/` files are updated after changes
+- Verify docstrings follow Google-style format exactly
+- Ensure examples in docstrings are working and accurate
+- Confirm all parameters and return values are documented
+
+---
+
+## 📝 Quick Reference
+
+**Documentation Workflow**:
+1. 📖 Check `docs/lkml2cube_*.md` first
+2. 📝 Add/update Google-style docstrings
+3. 🔄 Run `python scripts/generate_docs.py`
+4. ✅ Verify documentation is updated
+
+**Key Files**:
+- `docs/lkml2cube_main.md` - CLI documentation
+- `docs/lkml2cube_converter.md` - Python API documentation
+- `docs/lkml2cube_parser_cube_api.md` - Module to interact with Cube meta API
+- `docs/lkml2cube_parser_loader.md` - File loading and writing utilities
+- `docs/lkml2cube_parser_explores.md` - Module to convert LookML explores to Cube `view` definitions
+- `docs/lkml2cube_parser_types.md` - Custom YAML types for proper formatting
+- `docs/lkml2cube_parser_views.md` - Module to convert LookML views to Cube `cube` definitions
+- `scripts/generate_docs.py` - Documentation generation script
+
+**Remember**: Documentation first, code second. Always maintain docstrings!
\ No newline at end of file
diff --git a/README.md b/README.md
index 2745ff5..e060a8f 100644
--- a/README.md
+++ b/README.md
@@ -16,11 +16,19 @@ A comprehensive tool for bidirectional conversion between LookML and Cube data m
pip install lkml2cube
```
-## Commands
+## Usage
+
+lkml2cube can be used both as a command-line tool and as a Python library:
+
+### Command Line Interface
+
+Use the CLI commands for quick conversions and automation:
+
+#### Commands
lkml2cube provides three main commands for different conversion scenarios:
-### 1. `cubes` - LookML Views → Cube Models
+##### 1. `cubes` - LookML Views → Cube Models
Converts LookML view files into Cube YAML definitions (cubes only).
@@ -35,7 +43,7 @@ lkml2cube cubes --parseonly path/to/orders.view.lkml
lkml2cube cubes views/orders.view.lkml --outputdir models/ --rootdir ../my_project/
```
-### 2. `views` - LookML Explores → Cube Models
+##### 2. `views` - LookML Explores → Cube Models
Converts LookML explore files into Cube YAML definitions (cubes + views with joins).
@@ -47,7 +55,7 @@ lkml2cube views path/to/sales_analysis.explore.lkml --outputdir examples/
lkml2cube views --printonly path/to/sales_analysis.explore.lkml
```
-### 3. `explores` - Cube Meta API → LookML ✨ **NEW**
+##### 3. `explores` - Cube Meta API → LookML ✨ **NEW**
Generates production-ready LookML files from Cube's meta API endpoint.
@@ -68,6 +76,79 @@ lkml2cube explores "https://your-cube.com/cubejs-api/v1/meta" \
--printonly
```
+### Python API
+
+For programmatic usage, import and use the `LookMLConverter` class:
+
+```python
+from lkml2cube.converter import LookMLConverter
+
+# Initialize converter with options
+converter = LookMLConverter(
+ outputdir="./output",
+ rootdir="./models",
+ parseonly=False,
+ printonly=False,
+ use_explores_name=False
+)
+
+# Convert LookML views to Cube definitions
+result = converter.cubes("path/to/orders.view.lkml")
+print(f"Generated {len(result['summary']['cubes'])} cube files")
+
+# Convert LookML explores to Cube definitions with views
+result = converter.views("path/to/explores.lkml")
+print(f"Generated {len(result['summary']['views'])} view files")
+
+# Generate LookML from Cube API
+result = converter.explores("https://api.cube.dev/v1/meta", "jwt-token")
+print(f"Generated {len(result['summary']['views'])} LookML views")
+```
+
+#### Configuration Management
+
+The converter maintains state and can be reconfigured:
+
+```python
+# Update configuration
+converter.set_config(parseonly=True, outputdir="/tmp/new-output")
+
+# Get current configuration
+config = converter.get_config()
+print(f"Current output directory: {config['outputdir']}")
+
+# Validate files before processing
+file_paths = ["model1.lkml", "model2.lkml"]
+validation_results = converter.validate_files(file_paths)
+valid_files = [f for f, valid in validation_results.items() if valid]
+```
+
+#### Return Values
+
+All conversion methods return a dictionary with:
+
+- **parseonly=True**: `{'lookml_model': dict, 'parsed_model': str}`
+- **printonly=True**: `{'lookml_model': dict, 'cube_def': dict, 'yaml_output': str}`
+- **Default**: `{'lookml_model': dict, 'cube_def': dict, 'summary': dict}`
+
+The `summary` contains details about generated files:
+
+```python
+{
+ 'cubes': [{'name': 'orders', 'path': '/output/cubes/orders.yml'}],
+ 'views': [{'name': 'orders_view', 'path': '/output/views/orders_view.yml'}]
+}
+```
+
+#### Why Use the Python API?
+
+- **State Management**: Maintain configuration across multiple conversions
+- **Programmatic Control**: Integrate conversions into data pipelines
+- **Validation**: Check file validity before processing
+- **Error Handling**: Catch and handle conversion errors gracefully
+- **Batch Processing**: Process multiple files efficiently
+- **Custom Workflows**: Build complex conversion workflows
+
## What Gets Generated
### From Cube Cubes → LookML Views
@@ -140,6 +221,7 @@ explore order_analysis {
The tool automatically handles LookML `include` statements and can resolve relative paths:
+**CLI:**
```sh
# Use --rootdir to resolve include paths
lkml2cube views explores/sales.explore.lkml \
@@ -147,10 +229,21 @@ lkml2cube views explores/sales.explore.lkml \
--rootdir /path/to/lookml/project/
```
+**Python API:**
+```python
+# Set rootdir for include resolution
+converter = LookMLConverter(
+ rootdir="/path/to/lookml/project/",
+ outputdir="output/"
+)
+result = converter.views("explores/sales.explore.lkml")
+```
+
### Authentication for Cube API
The `explores` command requires a valid JWT token for Cube authentication:
+**CLI:**
```sh
# Get your token from Cube's authentication
export CUBE_TOKEN="eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9..."
@@ -160,6 +253,83 @@ lkml2cube explores "https://your-cube.com/cubejs-api/v1/meta" \
--outputdir looker_models/
```
+**Python API:**
+```python
+# Use environment variables or pass token directly
+import os
+converter = LookMLConverter(outputdir="looker_models/")
+result = converter.explores(
+ "https://your-cube.com/cubejs-api/v1/meta",
+ os.getenv("CUBE_TOKEN")
+)
+```
+
+### Batch Processing
+
+The Python API makes it easy to process multiple files:
+
+```python
+from lkml2cube.converter import LookMLConverter
+from pathlib import Path
+
+converter = LookMLConverter(outputdir="output/")
+
+# Process all LookML files in a directory
+lookml_dir = Path("models/")
+for lkml_file in lookml_dir.glob("*.lkml"):
+ try:
+ print(f"Processing {lkml_file}...")
+ result = converter.cubes(str(lkml_file))
+ print(f" ✓ Generated {len(result['summary']['cubes'])} cubes")
+ except Exception as e:
+ print(f" ✗ Error processing {lkml_file}: {e}")
+
+# Validate files before processing
+file_paths = [str(f) for f in lookml_dir.glob("*.lkml")]
+validation_results = converter.validate_files(file_paths)
+valid_files = [f for f, valid in validation_results.items() if valid]
+print(f"Found {len(valid_files)} valid LookML files")
+```
+
+### Pipeline Integration
+
+Integrate lkml2cube into your data pipeline:
+
+```python
+from lkml2cube.converter import LookMLConverter
+import logging
+
+def sync_cube_to_lookml(cube_api_url: str, token: str, output_dir: str):
+ """Sync Cube models to LookML files."""
+ converter = LookMLConverter(outputdir=output_dir)
+
+ try:
+ # Generate LookML from Cube API
+ result = converter.explores(cube_api_url, token)
+
+ # Log results
+ views_count = len(result['summary']['views'])
+ explores_count = len(result['summary']['explores'])
+
+ logging.info(f"Generated {views_count} LookML views")
+ logging.info(f"Generated {explores_count} LookML explores")
+
+ return result['summary']
+
+ except Exception as e:
+ logging.error(f"Failed to sync Cube to LookML: {e}")
+ raise
+
+# Use in your pipeline
+if __name__ == "__main__":
+ summary = sync_cube_to_lookml(
+ "https://your-cube.com/cubejs-api/v1/meta",
+ "your-jwt-token",
+ "looker_models/"
+ )
+ print(f"Sync complete: {summary}")
+```
+
## Output Structure
The tool creates organized directory structures:
diff --git a/docs/lkml2cube_converter.md b/docs/lkml2cube_converter.md
new file mode 100644
index 0000000..d3e05ac
--- /dev/null
+++ b/docs/lkml2cube_converter.md
@@ -0,0 +1,304 @@
+# Table of Contents
+
+* [lkml2cube.converter](#lkml2cube.converter)
+ * [LookMLConverter](#lkml2cube.converter.LookMLConverter)
+ * [\_\_init\_\_](#lkml2cube.converter.LookMLConverter.__init__)
+ * [cubes](#lkml2cube.converter.LookMLConverter.cubes)
+ * [views](#lkml2cube.converter.LookMLConverter.views)
+ * [explores](#lkml2cube.converter.LookMLConverter.explores)
+ * [set\_config](#lkml2cube.converter.LookMLConverter.set_config)
+ * [get\_config](#lkml2cube.converter.LookMLConverter.get_config)
+ * [validate\_files](#lkml2cube.converter.LookMLConverter.validate_files)
+ * [clear\_cache](#lkml2cube.converter.LookMLConverter.clear_cache)
+ * [\_\_repr\_\_](#lkml2cube.converter.LookMLConverter.__repr__)
+
+
+
+# lkml2cube.converter
+
+Main converter class for lkml2cube providing a Python API for LookML to Cube conversion.
+
+This module provides a high-level interface for converting LookML models to Cube definitions
+without requiring CLI usage. It maintains configuration state and provides the same
+functionality as the CLI commands.
+
+
+
+## LookMLConverter Objects
+
+```python
+class LookMLConverter()
+```
+
+Main converter class for LookML to Cube conversion operations.
+
+This class provides a Python API for converting LookML models to Cube definitions,
+maintaining configuration state and providing the same functionality as the CLI commands.
+
+**Attributes**:
+
+- `outputdir` _str_ - Directory where output files will be written.
+- `rootdir` _str | None_ - Root directory for resolving LookML includes.
+- `parseonly` _bool_ - If True, only parse and return Python dict representation.
+- `printonly` _bool_ - If True, print YAML output to stdout instead of writing files.
+- `use_explores_name` _bool_ - Whether to use explore names for cube view names.
+
+
+
+#### \_\_init\_\_
+
+```python
+def __init__(outputdir: str = ".",
+ rootdir: Optional[str] = None,
+ parseonly: bool = False,
+ printonly: bool = False,
+ use_explores_name: bool = False)
+```
+
+Initialize the LookML converter with configuration options.
+
+**Arguments**:
+
+- `outputdir` _str, optional_ - Directory where output files will be written. Defaults to ".".
+- `rootdir` _str | None, optional_ - Root directory for resolving LookML includes. Defaults to None.
+- `parseonly` _bool, optional_ - If True, only parse and return Python dict representation. Defaults to False.
+- `printonly` _bool, optional_ - If True, print YAML output to stdout instead of writing files. Defaults to False.
+- `use_explores_name` _bool, optional_ - Whether to use explore names for cube view names. Defaults to False.
+
+
+**Example**:
+
+ >>> converter = LookMLConverter(outputdir="/tmp/output", rootdir="/lookml/models")
+ >>> result = converter.cubes("models/*.lkml")
+ >>> print(result['summary']['cubes'][0]['name'])
+ 'orders'
+
+
+
+#### cubes
+
+```python
+def cubes(file_path: str) -> Dict[str, Any]
+```
+
+Generate cube definitions from LookML views.
+
+Converts LookML views into Cube cube definitions, handling dimensions, measures,
+and basic join relationships.
+
+**Arguments**:
+
+- `file_path` _str_ - Path to LookML file(s) to process (supports glob patterns).
+
+
+**Returns**:
+
+- `dict` - Result dictionary containing:
+ - 'lookml_model': Parsed LookML model (if parseonly=True)
+ - 'cube_def': Generated cube definitions
+ - 'yaml_output': YAML string representation (if printonly=True)
+ - 'summary': File generation summary (if files written)
+
+
+**Raises**:
+
+- `ValueError` - If no files are found at the specified path.
+
+
+**Example**:
+
+ >>> converter = LookMLConverter()
+ >>> result = converter.cubes("models/orders.lkml")
+ >>> print(result['cube_def']['cubes'][0]['name'])
+ 'orders'
+
+
+
+#### views
+
+```python
+def views(file_path: str) -> Dict[str, Any]
+```
+
+Generate cube definitions with views from LookML explores.
+
+Converts LookML explores into Cube definitions including both cubes and views
+with join relationships.
+
+**Arguments**:
+
+- `file_path` _str_ - Path to LookML file(s) to process (supports glob patterns).
+
+
+**Returns**:
+
+- `dict` - Result dictionary containing:
+ - 'lookml_model': Parsed LookML model (if parseonly=True)
+ - 'cube_def': Generated cube definitions with views
+ - 'yaml_output': YAML string representation (if printonly=True)
+ - 'summary': File generation summary (if files written)
+
+
+**Raises**:
+
+- `ValueError` - If no files are found at the specified path.
+
+
+**Example**:
+
+ >>> converter = LookMLConverter(use_explores_name=True)
+ >>> result = converter.views("models/explores.lkml")
+ >>> print(len(result['cube_def']['views']))
+ 2
+
+
+
+#### explores
+
+```python
+def explores(metaurl: str, token: str) -> Dict[str, Any]
+```
+
+Generate LookML explores from Cube meta API.
+
+Fetches Cube model from meta API and converts it to LookML explores,
+correctly mapping Cube cubes to LookML views and Cube views to LookML explores.
+
+**Arguments**:
+
+- `metaurl` _str_ - URL to the Cube meta API endpoint.
+- `token` _str_ - JWT token for Cube meta API authentication.
+
+
+**Returns**:
+
+- `dict` - Result dictionary containing:
+ - 'cube_model': Raw Cube model from meta API (if parseonly=True)
+ - 'lookml_model': Converted LookML model
+ - 'yaml_output': YAML string representation (if printonly=True)
+ - 'summary': File generation summary (if files written)
+
+
+**Raises**:
+
+- `ValueError` - If no response is received from the meta API.
+- `Exception` - If API request fails or token is invalid.
+
+
+**Example**:
+
+ >>> converter = LookMLConverter(outputdir="/tmp/lookml")
+ >>> result = converter.explores("https://api.cube.dev/v1/meta", "jwt-token")
+ >>> print(len(result['lookml_model']['explores']))
+ 3
+
+
+
+#### set\_config
+
+```python
+def set_config(outputdir: Optional[str] = None,
+ rootdir: Optional[str] = None,
+ parseonly: Optional[bool] = None,
+ printonly: Optional[bool] = None,
+ use_explores_name: Optional[bool] = None) -> None
+```
+
+Update converter configuration options.
+
+**Arguments**:
+
+- `outputdir` _str | None, optional_ - Directory where output files will be written.
+- `rootdir` _str | None, optional_ - Root directory for resolving LookML includes.
+- `parseonly` _bool | None, optional_ - If True, only parse and return Python dict representation.
+- `printonly` _bool | None, optional_ - If True, print YAML output to stdout instead of writing files.
+- `use_explores_name` _bool | None, optional_ - Whether to use explore names for cube view names.
+
+
+**Example**:
+
+ >>> converter = LookMLConverter()
+ >>> converter.set_config(outputdir="/new/path", parseonly=True)
+ >>> result = converter.cubes("models/*.lkml")
+ # Will now parse only and use the new output directory
+
+
+
+#### get\_config
+
+```python
+def get_config() -> Dict[str, Any]
+```
+
+Get current converter configuration.
+
+**Returns**:
+
+- `dict` - Current configuration settings.
+
+
+**Example**:
+
+ >>> converter = LookMLConverter(outputdir="/tmp")
+ >>> config = converter.get_config()
+ >>> print(config['outputdir'])
+ '/tmp'
+
+
+
+#### validate\_files
+
+```python
+def validate_files(file_paths: List[str]) -> Dict[str, bool]
+```
+
+Validate that LookML files exist and can be loaded.
+
+**Arguments**:
+
+- `file_paths` _list[str]_ - List of file paths to validate.
+
+
+**Returns**:
+
+- `dict` - Dictionary mapping file paths to validation results.
+
+
+**Example**:
+
+ >>> converter = LookMLConverter()
+ >>> results = converter.validate_files(["models/orders.lkml", "models/missing.lkml"])
+ >>> print(results["models/orders.lkml"])
+ True
+
+
+
+#### clear\_cache
+
+```python
+def clear_cache() -> None
+```
+
+Clear the global file loader cache.
+
+This method clears the `visited_path` cache used by the file loader to prevent
+circular includes. Useful for ensuring clean state between operations or
+in testing scenarios.
+
+**Example**:
+
+ >>> converter = LookMLConverter()
+ >>> converter.cubes("models/orders.lkml") # Populates cache
+ >>> converter.clear_cache() # Clears cache
+ >>> converter.cubes("models/orders.lkml") # Loads fresh from disk
+
+
+
+#### \_\_repr\_\_
+
+```python
+def __repr__() -> str
+```
+
+Return string representation of the converter.
+
diff --git a/docs/lkml2cube_main.md b/docs/lkml2cube_main.md
new file mode 100644
index 0000000..4ac4079
--- /dev/null
+++ b/docs/lkml2cube_main.md
@@ -0,0 +1,223 @@
+# Table of Contents
+
+* [lkml2cube.main](#lkml2cube.main)
+ * [callback](#lkml2cube.main.callback)
+ * [cubes](#lkml2cube.main.cubes)
+ * [views](#lkml2cube.main.views)
+ * [explores](#lkml2cube.main.explores)
+
+
+
+# lkml2cube.main
+
+Main CLI module for lkml2cube - LookML to Cube bidirectional converter.
+
+This module provides the command-line interface for lkml2cube, offering three main commands:
+- cubes: Convert LookML views to Cube definitions
+- views: Convert LookML explores to Cube definitions with views
+- explores: Generate LookML explores from Cube meta API
+
+The CLI is built using Typer and provides rich console output with proper error handling.
+Each command supports various options for parsing, output formatting, and file generation.
+
+
+
+#### callback
+
+```python
+@app.callback()
+def callback()
+```
+
+Main callback function for the lkml2cube CLI application.
+
+This function serves as the entry point for the CLI and provides
+general information about the tool. It sets up the global context
+for all subcommands.
+
+**Notes**:
+
+ This function is called automatically by Typer when the CLI is invoked.
+ It doesn't perform any specific actions but serves as a placeholder
+ for global CLI configuration.
+
+
+**Example**:
+
+ $ lkml2cube --help
+ # Shows help information for the entire CLI
+
+
+
+#### cubes
+
+```python
+@app.command()
+def cubes(
+ file_path: Annotated[str,
+ typer.Argument(help="The path for the file to read")],
+ parseonly: Annotated[
+ bool,
+ typer.Option(help=("When present it will only show the python"
+ " dict read from the lookml file")),
+ ] = False,
+ outputdir: Annotated[
+ str,
+ typer.Option(
+ help="The path for the output files to be generated")] = ".",
+ printonly: Annotated[bool,
+ typer.Option(
+ help="Print to stdout the parsed files")] = False,
+ rootdir: Annotated[
+ str,
+ typer.Option(help="The path to prepend to include paths")] = None)
+```
+
+Generate Cube model definitions from LookML view files.
+
+Converts LookML view files into Cube YAML definitions, handling dimensions,
+measures, and basic join relationships. This command focuses on generating
+cube definitions only (no views).
+
+**Arguments**:
+
+- `file_path` _str_ - Path to the LookML file to process (supports glob patterns).
+- `parseonly` _bool, optional_ - If True, only displays the parsed LookML as Python dict. Defaults to False.
+- `outputdir` _str, optional_ - Directory where output files will be written. Defaults to ".".
+- `printonly` _bool, optional_ - If True, prints YAML to stdout instead of writing files. Defaults to False.
+- `rootdir` _str | None, optional_ - Root directory for resolving LookML includes. Defaults to None.
+
+
+**Raises**:
+
+- `typer.Exit` - If no files are found at the specified path.
+
+
+**Example**:
+
+ $ lkml2cube cubes models/orders.view.lkml --outputdir output/
+ # Generates cube definitions in output/cubes/
+
+ $ lkml2cube cubes models/orders.view.lkml --parseonly
+ # Shows parsed LookML structure
+
+ $ lkml2cube cubes models/orders.view.lkml --printonly
+ # Prints YAML to console
+
+
+
+#### views
+
+```python
+@app.command()
+def views(
+ file_path: Annotated[str,
+ typer.Argument(
+ help="The path for the explore to read")],
+ parseonly: Annotated[
+ bool,
+ typer.Option(help=("When present it will only show the python"
+ " dict read from the lookml file")),
+ ] = False,
+ outputdir: Annotated[
+ str,
+ typer.Option(
+ help="The path for the output files to be generated")] = ".",
+ printonly: Annotated[bool,
+ typer.Option(
+ help="Print to stdout the parsed files")] = False,
+ rootdir: Annotated[
+ str, typer.Option(help="The path to prepend to include paths")] = None,
+ use_explores_name: Annotated[
+ bool,
+ typer.Option(help="Use explore names for cube view names")] = False)
+```
+
+Generate Cube model definitions with views from LookML explore files.
+
+Converts LookML explore files into Cube YAML definitions, creating both
+cube definitions and view definitions with join relationships. This command
+generates a complete Cube model with views that define how cubes relate to each other.
+
+**Arguments**:
+
+- `file_path` _str_ - Path to the LookML explore file to process (supports glob patterns).
+- `parseonly` _bool, optional_ - If True, only displays the parsed LookML as Python dict. Defaults to False.
+- `outputdir` _str, optional_ - Directory where output files will be written. Defaults to ".".
+- `printonly` _bool, optional_ - If True, prints YAML to stdout instead of writing files. Defaults to False.
+- `rootdir` _str | None, optional_ - Root directory for resolving LookML includes. Defaults to None.
+- `use_explores_name` _bool, optional_ - If True, uses explore names for cube view names. Defaults to False.
+
+
+**Raises**:
+
+- `typer.Exit` - If no files are found at the specified path.
+
+
+**Example**:
+
+ $ lkml2cube views models/explores.lkml --outputdir output/
+ # Generates cubes and views in output/cubes/ and output/views/
+
+ $ lkml2cube views models/explores.lkml --use-explores-name
+ # Uses explore names for view naming
+
+ $ lkml2cube views models/explores.lkml --parseonly
+ # Shows parsed LookML structure
+
+
+
+#### explores
+
+```python
+@app.command()
+def explores(
+ metaurl: Annotated[str,
+ typer.Argument(help="The url for cube meta endpoint")],
+ token: Annotated[str, typer.Option(help="JWT token for Cube meta")],
+ parseonly: Annotated[
+ bool,
+ typer.Option(help=("When present it will only show the python"
+ " dict read from the lookml file")),
+ ] = False,
+ outputdir: Annotated[
+ str,
+ typer.Option(
+ help="The path for the output files to be generated")] = ".",
+ printonly: Annotated[
+ bool, typer.Option(help="Print to stdout the parsed files")] = False)
+```
+
+Generate LookML explores and views from Cube meta API.
+
+Fetches Cube model definitions from the meta API and converts them to
+production-ready LookML files. This command correctly maps:
+- Cube cubes (with sql_table/sql) → LookML views
+- Cube views (with aliasMember joins) → LookML explores
+
+**Arguments**:
+
+- `metaurl` _str_ - URL to the Cube meta API endpoint (e.g., https://api.cube.dev/v1/meta).
+- `token` _str_ - JWT authentication token for the Cube meta API.
+- `parseonly` _bool, optional_ - If True, only displays the parsed Cube model as Python dict. Defaults to False.
+- `outputdir` _str, optional_ - Directory where output files will be written. Defaults to ".".
+- `printonly` _bool, optional_ - If True, prints YAML to stdout instead of writing files. Defaults to False.
+
+
+**Raises**:
+
+- `typer.Exit` - If no response is received from the meta API.
+- `ValueError` - If the token is invalid or API request fails.
+
+
+**Example**:
+
+ $ lkml2cube explores "https://api.cube.dev/v1/meta" --token "jwt-token" --outputdir lookml/
+ # Generates LookML views and explores in lookml/views/ and lookml/explores/
+
+ $ lkml2cube explores "https://api.cube.dev/v1/meta" --token "jwt-token" --parseonly
+ # Shows parsed Cube model structure
+
+ $ lkml2cube explores "https://api.cube.dev/v1/meta" --token "jwt-token" --printonly
+ # Prints generated LookML to console
+
diff --git a/docs/lkml2cube_parser_cube_api.md b/docs/lkml2cube_parser_cube_api.md
new file mode 100644
index 0000000..7e8b9f1
--- /dev/null
+++ b/docs/lkml2cube_parser_cube_api.md
@@ -0,0 +1,101 @@
+# Table of Contents
+
+* [lkml2cube.parser.cube\_api](#lkml2cube.parser.cube_api)
+ * [meta\_loader](#lkml2cube.parser.cube_api.meta_loader)
+ * [parse\_members](#lkml2cube.parser.cube_api.parse_members)
+ * [parse\_meta](#lkml2cube.parser.cube_api.parse_meta)
+
+
+
+# lkml2cube.parser.cube\_api
+
+
+
+#### meta\_loader
+
+```python
+def meta_loader(meta_url: str, token: str) -> dict
+```
+
+Load the Cube meta API and return the model as a dictionary.
+
+**Arguments**:
+
+- `meta_url` _str_ - URL to the Cube meta API endpoint.
+- `token` _str_ - Authentication token for the API.
+
+
+**Returns**:
+
+- `dict` - Cube model data from the meta API.
+
+
+**Raises**:
+
+- `ValueError` - If no valid token is provided.
+- `Exception` - If the API request fails or returns non-200 status.
+
+
+**Example**:
+
+ >>> model = meta_loader('https://api.cube.dev/v1/meta', 'my-token')
+ >>> print(model['cubes'][0]['name'])
+ 'orders'
+
+
+
+#### parse\_members
+
+```python
+def parse_members(members: list) -> list
+```
+
+Parse measures and dimensions from the Cube meta model.
+
+**Arguments**:
+
+- `members` _list_ - List of dimension or measure definitions from Cube meta.
+
+
+**Returns**:
+
+- `list` - List of parsed members in LookML format.
+
+
+**Example**:
+
+ >>> members = [{'name': 'total_sales', 'type': 'sum', 'sql': 'amount'}]
+ >>> parsed = parse_members(members)
+ >>> print(parsed[0]['name'])
+ 'total_sales'
+
+
+
+#### parse\_meta
+
+```python
+def parse_meta(cube_model: dict) -> dict
+```
+
+Parse the Cube meta model and return a simplified version.
+
+Separates Cube cubes (-> LookML views) from Cube views (-> LookML explores).
+
+**Arguments**:
+
+- `cube_model` _dict_ - Complete Cube model from meta API.
+
+
+**Returns**:
+
+- `dict` - LookML model with structure: `{'views': list, 'explores': list}`.
+
+
+**Example**:
+
+ >>> cube_model = {'cubes': [{'name': 'orders', 'sql_table': 'orders'}]}
+ >>> lookml_model = parse_meta(cube_model)
+ >>> print(lookml_model['views'][0]['name'])
+ 'orders'
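The cube-vs-view split described above can be sketched in a few lines. This is a simplified illustration of the classification heuristic, not the library's implementation; the function name `parse_meta_sketch` and the exact keys inspected are assumptions:

```python
def parse_meta_sketch(cube_model: dict) -> dict:
    """Split Cube meta entries: table-backed cubes become LookML views,
    everything else is treated as a Cube view, i.e. a LookML explore."""
    lookml = {"views": [], "explores": []}
    for cube in cube_model.get("cubes", []):
        # Heuristic (assumption): entries backed by a table or SQL are cubes;
        # entries without either are Cube views joining other cubes.
        if cube.get("sql_table") or cube.get("sql"):
            lookml["views"].append({"name": cube["name"]})
        else:
            lookml["explores"].append({"name": cube["name"]})
    return lookml

model = {"cubes": [{"name": "orders", "sql_table": "orders"},
                   {"name": "orders_summary"}]}
print(parse_meta_sketch(model))
# {'views': [{'name': 'orders'}], 'explores': [{'name': 'orders_summary'}]}
```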
+
diff --git a/docs/lkml2cube_parser_explores.md b/docs/lkml2cube_parser_explores.md
new file mode 100644
index 0000000..bb17e62
--- /dev/null
+++ b/docs/lkml2cube_parser_explores.md
@@ -0,0 +1,255 @@
+# Table of Contents
+
+* [lkml2cube.parser.explores](#lkml2cube.parser.explores)
+ * [snakify](#lkml2cube.parser.explores.snakify)
+ * [build\_cube\_name\_look\_up](#lkml2cube.parser.explores.build_cube_name_look_up)
+ * [get\_cube\_from\_cube\_def](#lkml2cube.parser.explores.get_cube_from_cube_def)
+ * [get\_cube\_names\_from\_join\_condition](#lkml2cube.parser.explores.get_cube_names_from_join_condition)
+ * [traverse\_graph](#lkml2cube.parser.explores.traverse_graph)
+ * [generate\_cube\_joins](#lkml2cube.parser.explores.generate_cube_joins)
+ * [generate\_cube\_views](#lkml2cube.parser.explores.generate_cube_views)
+ * [parse\_explores](#lkml2cube.parser.explores.parse_explores)
+
+
+
+# lkml2cube.parser.explores
+
+
+
+#### snakify
+
+```python
+def snakify(s)
+```
+
+Convert a string to snake_case format.
+
+**Arguments**:
+
+- `s` _str_ - String to convert to snake_case.
+
+
+**Returns**:
+
+- `str` - Snake_case version of the input string.
+
+
+**Example**:
+
+ >>> snakify('MyViewName')
+ 'my_view_name'
+ >>> snakify('Order-Details')
+ 'order_details'
+
+
+
+#### build\_cube\_name\_look\_up
+
+```python
+def build_cube_name_look_up(cube_def)
+```
+
+Build a lookup dictionary for cube names in the cube definition.
+
+**Arguments**:
+
+- `cube_def` _dict_ - Cube definition containing 'cubes' list.
+
+
+**Notes**:
+
+  This function modifies the `cube_def` dictionary in place, adding
+  a 'cube_name_look_up' key if it doesn't already exist.
+
+
+**Example**:
+
+ >>> cube_def = {'cubes': [{'name': 'orders'}, {'name': 'customers'}]}
+ >>> build_cube_name_look_up(cube_def)
+ >>> print('orders' in cube_def['cube_name_look_up'])
+ True
+
+
+
+#### get\_cube\_from\_cube\_def
+
+```python
+def get_cube_from_cube_def(cube_def, cube_name)
+```
+
+Get a cube definition by name from the cube definition.
+
+**Arguments**:
+
+- `cube_def` _dict_ - Cube definition containing 'cubes' list.
+- `cube_name` _str_ - Name of the cube to retrieve.
+
+
+**Returns**:
+
+ dict | None: Cube definition if found, None otherwise.
+
+
+**Example**:
+
+ >>> cube_def = {'cubes': [{'name': 'orders', 'sql_table': 'orders'}]}
+ >>> cube = get_cube_from_cube_def(cube_def, 'orders')
+ >>> print(cube['sql_table'])
+ 'orders'
+
+
+
+#### get\_cube\_names\_from\_join\_condition
+
+```python
+def get_cube_names_from_join_condition(join_condition)
+```
+
+Extract cube names from a join condition SQL string.
+
+**Arguments**:
+
+- `join_condition` _str_ - SQL join condition containing cube references.
+
+
+**Returns**:
+
+- `list[str]` - List of cube names found in the join condition.
+
+
+**Example**:
+
+ >>> join_condition = '${orders.customer_id} = ${customers.id}'
+ >>> get_cube_names_from_join_condition(join_condition)
+ ['orders', 'customers']
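Since `${cube.field}` references carry the cube name before the dot, the extraction can be sketched with a single regular expression. This is an illustrative sketch, not the library's code; the function name is an assumption:

```python
import re

def cube_names_from_sql_on(join_condition: str) -> list:
    # Capture the cube name inside each ${cube.field} reference
    refs = re.findall(r"\$\{(\w+)\.\w+\}", join_condition)
    # Deduplicate while preserving first-seen order
    return list(dict.fromkeys(refs))

print(cube_names_from_sql_on("${orders.customer_id} = ${customers.id}"))
# ['orders', 'customers']
```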
+
+
+
+#### traverse\_graph
+
+```python
+def traverse_graph(join_paths, cube_left, cube_right)
+```
+
+Find the shortest path between two cubes using BFS traversal.
+
+**Arguments**:
+
+- `join_paths` _dict_ - Dictionary mapping cube names to their connected cubes.
+- `cube_left` _str_ - Starting cube name.
+- `cube_right` _str_ - Target cube name.
+
+
+**Returns**:
+
+- `str` - Dot-separated path from cube_left to cube_right.
+
+
+**Example**:
+
+ >>> join_paths = {'orders': ['customers'], 'customers': ['addresses']}
+ >>> traverse_graph(join_paths, 'orders', 'addresses')
+ 'orders.customers.addresses'
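The BFS traversal over the join graph can be sketched as follows. This is a minimal illustration of the idea, not the library's implementation; in particular, the fallback when no path exists is an assumption:

```python
from collections import deque

def traverse_graph_sketch(join_paths: dict, cube_left: str, cube_right: str) -> str:
    """Breadth-first search returning the shortest dot-separated path of cube names."""
    queue = deque([[cube_left]])
    seen = {cube_left}
    while queue:
        path = queue.popleft()
        if path[-1] == cube_right:
            return ".".join(path)
        for neighbor in join_paths.get(path[-1], []):
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append(path + [neighbor])
    return cube_left  # assumption: fall back to the starting cube when no path exists

paths = {"orders": ["customers"], "customers": ["addresses"]}
print(traverse_graph_sketch(paths, "orders", "addresses"))  # orders.customers.addresses
```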
+
+
+
+#### generate\_cube\_joins
+
+```python
+def generate_cube_joins(cube_def, lookml_model)
+```
+
+Generate cube join definitions from LookML explores.
+
+**Arguments**:
+
+- `cube_def` _dict_ - Existing cube definition to modify.
+- `lookml_model` _dict_ - LookML model containing explores with joins.
+
+
+**Returns**:
+
+- `dict` - Updated cube definition with join information added to cubes.
+
+
+**Raises**:
+
+- `Exception` - If cube referenced in explores is not found.
+
+
+**Example**:
+
+ >>> cube_def = {'cubes': [{'name': 'orders'}, {'name': 'customers'}]}
+ >>> lookml_model = {'explores': [{'joins': [{'name': 'customers', 'sql_on': '${orders.customer_id} = ${customers.id}', 'relationship': 'many_to_one'}]}]}
+ >>> updated_def = generate_cube_joins(cube_def, lookml_model)
+ >>> print(updated_def['cubes'][1]['joins'][0]['name'])
+ 'orders'
+
+
+
+#### generate\_cube\_views
+
+```python
+def generate_cube_views(cube_def, lookml_model, use_explores_name=False)
+```
+
+Generate Cube view definitions from LookML explores.
+
+**Arguments**:
+
+- `cube_def` _dict_ - Cube definition to add views to.
+- `lookml_model` _dict_ - LookML model containing explores.
+- `use_explores_name` _bool, optional_ - Whether to use explore names as view names. Defaults to False.
+
+
+**Returns**:
+
+- `dict` - Updated cube definition with view definitions added.
+
+
+**Example**:
+
+ >>> cube_def = {'cubes': [{'name': 'orders'}]}
+ >>> lookml_model = {'explores': [{'name': 'orders_explore', 'label': 'Orders Analysis'}]}
+ >>> updated_def = generate_cube_views(cube_def, lookml_model)
+ >>> print(updated_def['views'][0]['name'])
+ 'orders_analysis'
+
+
+
+#### parse\_explores
+
+```python
+def parse_explores(lookml_model, use_explores_name=False)
+```
+
+Parse LookML explores into Cube definitions with joins and views.
+
+**Arguments**:
+
+- `lookml_model` _dict_ - LookML model containing views and explores.
+- `use_explores_name` _bool, optional_ - Whether to use explore names as view names. Defaults to False.
+
+
+**Returns**:
+
+- `dict` - Complete cube definition with cubes, joins, and views.
+
+
+**Raises**:
+
+- `Exception` - If no explores are found in the LookML model.
+
+
+**Example**:
+
+ >>> lookml_model = {
+ ... 'views': [{'name': 'orders', 'sql_table_name': 'orders'}],
+ ... 'explores': [{'name': 'orders_explore', 'joins': [{'name': 'customers', 'sql_on': '${orders.customer_id} = ${customers.id}', 'relationship': 'many_to_one'}]}]
+ ... }
+ >>> cube_def = parse_explores(lookml_model)
+ >>> print(len(cube_def['cubes']))
+ 1
+ >>> print(len(cube_def['views']))
+ 1
+
diff --git a/docs/lkml2cube_parser_loader.md b/docs/lkml2cube_parser_loader.md
new file mode 100644
index 0000000..06c88f1
--- /dev/null
+++ b/docs/lkml2cube_parser_loader.md
@@ -0,0 +1,200 @@
+# Table of Contents
+
+* [lkml2cube.parser.loader](#lkml2cube.parser.loader)
+ * [update\_namespace](#lkml2cube.parser.loader.update_namespace)
+ * [file\_loader](#lkml2cube.parser.loader.file_loader)
+ * [write\_single\_file](#lkml2cube.parser.loader.write_single_file)
+ * [write\_files](#lkml2cube.parser.loader.write_files)
+ * [write\_lookml\_files](#lkml2cube.parser.loader.write_lookml_files)
+ * [print\_summary](#lkml2cube.parser.loader.print_summary)
+
+
+
+# lkml2cube.parser.loader
+
+
+
+#### update\_namespace
+
+```python
+def update_namespace(namespace, new_file)
+```
+
+Update namespace with new file content, merging lists and handling conflicts.
+
+**Arguments**:
+
+- `namespace` _dict | None_ - Existing namespace dictionary or None.
+- `new_file` _dict_ - New file content to merge into namespace.
+
+
+**Returns**:
+
+- `dict` - Updated namespace with merged content.
+
+
+**Example**:
+
+ >>> namespace = {'views': [{'name': 'view1'}]}
+ >>> new_file = {'views': [{'name': 'view2'}]}
+ >>> update_namespace(namespace, new_file)
+ {'views': [{'name': 'view1'}, {'name': 'view2'}]}
+
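The list-merging behavior described above can be sketched like this. It is a simplified illustration under assumptions (the real function also handles key conflicts), not the library's implementation:

```python
def merge_namespace_sketch(namespace, new_file):
    """Merge parsed LookML content into a running namespace, concatenating list-valued keys."""
    if namespace is None:
        return dict(new_file)
    merged = dict(namespace)
    for key, value in new_file.items():
        if key in merged and isinstance(merged[key], list) and isinstance(value, list):
            merged[key] = merged[key] + value  # e.g. accumulate 'views' across files
        else:
            merged[key] = value
    return merged

ns = merge_namespace_sketch({"views": [{"name": "view1"}]}, {"views": [{"name": "view2"}]})
print([v["name"] for v in ns["views"]])  # ['view1', 'view2']
```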
+
+
+#### file\_loader
+
+```python
+def file_loader(file_path_input, rootdir_param, namespace=None)
+```
+
+Load LookML files and resolve includes recursively.
+
+**Arguments**:
+
+- `file_path_input` _str_ - File path pattern to load (supports glob patterns).
+- `rootdir_param` _str | None_ - Root directory for resolving includes.
+- `namespace` _dict | None_ - Existing namespace to merge content into.
+
+
+**Returns**:
+
+- `dict` - Loaded LookML model with resolved includes.
+
+
+**Raises**:
+
+- `FileNotFoundError` - If specified file path cannot be found.
+- `ValueError` - If LookML file cannot be parsed.
+
+
+**Example**:
+
+ >>> namespace = file_loader('models/*.lkml', '/path/to/root')
+ >>> print(namespace['views'][0]['name'])
+ 'my_view'
+
+
+
+#### write\_single\_file
+
+```python
+def write_single_file(cube_def: dict,
+ outputdir: str,
+ subdir: str = "cubes",
+ file_name: str = "my_cubes.yml")
+```
+
+Write a single cube definition to a YAML file.
+
+**Arguments**:
+
+- `cube_def` _dict_ - Cube definition to write.
+- `outputdir` _str_ - Output directory path.
+- `subdir` _str, optional_ - Subdirectory within output directory. Defaults to "cubes".
+- `file_name` _str, optional_ - Name of the output file. Defaults to "my_cubes.yml".
+
+
+**Raises**:
+
+- `OSError` - If output directory cannot be created or file cannot be written.
+
+
+**Example**:
+
+ >>> cube_def = {'cubes': [{'name': 'orders', 'sql_table': 'orders'}]}
+ >>> write_single_file(cube_def, '/output', 'cubes', 'orders.yml')
+
+
+
+#### write\_files
+
+```python
+def write_files(cube_def, outputdir)
+```
+
+Write cube definitions to separate files organized by type.
+
+**Arguments**:
+
+- `cube_def` _dict_ - Cube definitions containing 'cubes' and/or 'views' keys.
+- `outputdir` _str_ - Output directory path.
+
+
+**Returns**:
+
+- `dict` - Summary of written files with structure: `{'cubes': [{'name': str, 'path': str}], 'views': [{'name': str, 'path': str}]}`.
+
+
+**Raises**:
+
+- `Exception` - If no cube definition is provided.
+- `OSError` - If output directory cannot be created or files cannot be written.
+
+
+**Example**:
+
+ >>> cube_def = {'cubes': [{'name': 'orders'}], 'views': [{'name': 'orders_view'}]}
+ >>> summary = write_files(cube_def, '/output')
+ >>> print(summary['cubes'][0]['name'])
+ 'orders'
+
+
+
+#### write\_lookml\_files
+
+```python
+def write_lookml_files(lookml_model, outputdir)
+```
+
+Write LookML model to files in the output directory.
+
+**Arguments**:
+
+- `lookml_model` _dict_ - LookML model containing 'views' and/or 'explores' keys.
+- `outputdir` _str_ - Output directory path.
+
+
+**Returns**:
+
+- `dict` - Summary of written files with structure: `{'views': [{'name': str, 'path': str}], 'explores': [{'name': str, 'path': str}]}`.
+
+
+**Raises**:
+
+- `Exception` - If no LookML model is provided.
+- `OSError` - If output directory cannot be created or files cannot be written.
+
+
+**Example**:
+
+ >>> lookml_model = {'views': [{'name': 'orders'}], 'explores': [{'name': 'orders_explore'}]}
+ >>> summary = write_lookml_files(lookml_model, '/output')
+ >>> print(summary['views'][0]['name'])
+ 'orders'
+
+
+
+#### print\_summary
+
+```python
+def print_summary(summary)
+```
+
+Print a formatted summary of generated files using Rich tables.
+
+**Arguments**:
+
+- `summary` _dict_ - Summary dictionary containing file information with keys
+  'cubes', 'views', and/or 'explores', each containing lists of
+  `{'name': str, 'path': str}` dictionaries.
+
+
+**Example**:
+
+ >>> summary = {'cubes': [{'name': 'orders', 'path': '/output/cubes/orders.yml'}]}
+ >>> print_summary(summary)
+ # Displays a formatted table showing the generated files
+
diff --git a/docs/lkml2cube_parser_types.md b/docs/lkml2cube_parser_types.md
new file mode 100644
index 0000000..f2b8f38
--- /dev/null
+++ b/docs/lkml2cube_parser_types.md
@@ -0,0 +1,149 @@
+# Table of Contents
+
+* [lkml2cube.parser.types](#lkml2cube.parser.types)
+ * [Console](#lkml2cube.parser.types.Console)
+ * [print](#lkml2cube.parser.types.Console.print)
+ * [folded\_unicode](#lkml2cube.parser.types.folded_unicode)
+ * [literal\_unicode](#lkml2cube.parser.types.literal_unicode)
+ * [folded\_unicode\_representer](#lkml2cube.parser.types.folded_unicode_representer)
+ * [literal\_unicode\_representer](#lkml2cube.parser.types.literal_unicode_representer)
+
+
+
+# lkml2cube.parser.types
+
+
+
+## Console Objects
+
+```python
+class Console()
+```
+
+Simple console wrapper for printing messages.
+
+This class provides a basic print interface compatible with Rich console
+while falling back to standard print functionality.
+
+
+
+#### print
+
+```python
+def print(s, *args)
+```
+
+Print a message to the console.
+
+**Arguments**:
+
+- `s` _str_ - Message to print.
+- `*args` - Additional arguments (currently ignored).
+
+
+**Example**:
+
+ >>> console = Console()
+ >>> console.print("Hello world", style="bold")
+ Hello world
+
+
+
+## folded\_unicode Objects
+
+```python
+class folded_unicode(str)
+```
+
+String subclass for YAML folded scalar representation.
+
+This class marks strings that should be represented as folded scalars
+in YAML output (using the '>' style).
+
+**Example**:
+
+ >>> text = folded_unicode("This is a long\nstring that will be folded")
+ >>> # When dumped to YAML, this will use the folded '>' style
+
+
+
+## literal\_unicode Objects
+
+```python
+class literal_unicode(str)
+```
+
+String subclass for YAML literal scalar representation.
+
+This class marks strings that should be represented as literal scalars
+in YAML output (using the '|' style).
+
+**Example**:
+
+ >>> sql = literal_unicode("SELECT *\nFROM table\nWHERE id = 1")
+ >>> # When dumped to YAML, this will use the literal '|' style, preserving line breaks
+
+
+
+#### folded\_unicode\_representer
+
+```python
+def folded_unicode_representer(dumper, data)
+```
+
+YAML representer for folded_unicode strings.
+
+**Arguments**:
+
+- `dumper` - YAML dumper instance.
+- `data` _folded_unicode_ - String data to represent.
+
+
+**Returns**:
+
+ Scalar representation with folded style.
+
+
+**Example**:
+
+ >>> import yaml
+ >>> yaml.add_representer(folded_unicode, folded_unicode_representer)
+ >>> yaml.dump(folded_unicode("long text"))  # emits a '>'-style folded scalar
+
+
+
+#### literal\_unicode\_representer
+
+```python
+def literal_unicode_representer(dumper, data)
+```
+
+YAML representer for literal_unicode strings.
+
+**Arguments**:
+
+- `dumper` - YAML dumper instance.
+- `data` _literal_unicode_ - String data to represent.
+
+
+**Returns**:
+
+ Scalar representation with literal style.
+
+
+**Example**:
+
+ >>> import yaml
+ >>> yaml.add_representer(literal_unicode, literal_unicode_representer)
+ >>> yaml.dump(literal_unicode("SELECT *\nFROM table"))  # emits a '|'-style literal scalar
+
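Putting the two representers together, a self-contained registration sketch (assuming PyYAML is installed; the classes and representers are redefined inline here rather than imported from `lkml2cube.parser.types`) looks like:

```python
import yaml

class folded_unicode(str):
    """Marker type for '>'-style (folded) YAML scalars."""

class literal_unicode(str):
    """Marker type for '|'-style (literal) YAML scalars."""

def folded_unicode_representer(dumper, data):
    # Represent the string with the folded block style
    return dumper.represent_scalar("tag:yaml.org,2002:str", data, style=">")

def literal_unicode_representer(dumper, data):
    # Represent the string with the literal block style, preserving newlines
    return dumper.represent_scalar("tag:yaml.org,2002:str", data, style="|")

yaml.add_representer(folded_unicode, folded_unicode_representer)
yaml.add_representer(literal_unicode, literal_unicode_representer)

doc = {"sql": literal_unicode("SELECT *\nFROM orders\n")}
print(yaml.dump(doc, default_flow_style=False))
```

Dumping `doc` produces a `sql: |` block with the SQL's line breaks intact, which keeps generated Cube YAML readable.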
diff --git a/docs/lkml2cube_parser_views.md b/docs/lkml2cube_parser_views.md
new file mode 100644
index 0000000..870f403
--- /dev/null
+++ b/docs/lkml2cube_parser_views.md
@@ -0,0 +1,55 @@
+# Table of Contents
+
+* [lkml2cube.parser.views](#lkml2cube.parser.views)
+ * [parse\_view](#lkml2cube.parser.views.parse_view)
+
+
+
+# lkml2cube.parser.views
+
+
+
+#### parse\_view
+
+```python
+def parse_view(lookml_model, raise_when_views_not_present=True)
+```
+
+Parse LookML views into Cube definitions.
+
+Converts LookML view definitions into Cube format, handling dimensions, measures,
+view inheritance, and various LookML-specific features like tiers and drill fields.
+
+**Arguments**:
+
+- `lookml_model` _dict_ - LookML model containing views to parse.
+- `raise_when_views_not_present` _bool, optional_ - Whether to raise an exception
+ when no views are found. Defaults to True.
+
+
+**Returns**:
+
+- `dict` - Cube definitions with structure: `{'cubes': [{'name': str, 'description': str, 'dimensions': list, 'measures': list, 'joins': list}]}`.
+
+
+**Raises**:
+
+- `Exception` - If raise_when_views_not_present is True and no views are found,
+ or if required dimension properties are missing.
+
+
+**Example**:
+
+ >>> lookml_model = {
+ ... 'views': [{
+ ... 'name': 'orders',
+ ... 'sql_table_name': 'public.orders',
+ ... 'dimensions': [{'name': 'id', 'type': 'number', 'sql': '${TABLE}.id'}],
+ ... 'measures': [{'name': 'count', 'type': 'count'}]
+ ... }]
+ ... }
+ >>> cube_def = parse_view(lookml_model)
+ >>> print(cube_def['cubes'][0]['name'])
+ 'orders'
+
diff --git a/lkml2cube/converter.py b/lkml2cube/converter.py
new file mode 100644
index 0000000..4175e89
--- /dev/null
+++ b/lkml2cube/converter.py
@@ -0,0 +1,329 @@
+"""
+Main converter class for lkml2cube providing a Python API for LookML to Cube conversion.
+
+This module provides a high-level interface for converting LookML models to Cube definitions
+without requiring CLI usage. It maintains configuration state and provides the same
+functionality as the CLI commands.
+"""
+
+import pprint
+import yaml
+from typing import Optional, Dict, Any, List
+
+from lkml2cube.parser.cube_api import meta_loader, parse_meta
+from lkml2cube.parser.explores import parse_explores, generate_cube_joins
+from lkml2cube.parser.loader import file_loader, write_files, write_lookml_files, print_summary
+from lkml2cube.parser.views import parse_view
+from lkml2cube.parser.types import (
+ folded_unicode,
+ literal_unicode,
+ folded_unicode_representer,
+ literal_unicode_representer,
+ console,
+)
+
+
+class LookMLConverter:
+ """Main converter class for LookML to Cube conversion operations.
+
+ This class provides a Python API for converting LookML models to Cube definitions,
+ maintaining configuration state and providing the same functionality as the CLI commands.
+
+ Attributes:
+ outputdir (str): Directory where output files will be written.
+ rootdir (str | None): Root directory for resolving LookML includes.
+ parseonly (bool): If True, only parse and return Python dict representation.
+ printonly (bool): If True, print YAML output to stdout instead of writing files.
+ use_explores_name (bool): Whether to use explore names for cube view names.
+ """
+
+ def __init__(
+ self,
+ outputdir: str = ".",
+ rootdir: Optional[str] = None,
+ parseonly: bool = False,
+ printonly: bool = False,
+ use_explores_name: bool = False,
+ ):
+ """Initialize the LookML converter with configuration options.
+
+ Args:
+ outputdir (str, optional): Directory where output files will be written. Defaults to ".".
+ rootdir (str | None, optional): Root directory for resolving LookML includes. Defaults to None.
+ parseonly (bool, optional): If True, only parse and return Python dict representation. Defaults to False.
+ printonly (bool, optional): If True, print YAML output to stdout instead of writing files. Defaults to False.
+ use_explores_name (bool, optional): Whether to use explore names for cube view names. Defaults to False.
+
+ Example:
+ >>> converter = LookMLConverter(outputdir="/tmp/output", rootdir="/lookml/models")
+ >>> result = converter.cubes("models/*.lkml")
+ >>> print(result['summary']['cubes'][0]['name'])
+ 'orders'
+ """
+ self.outputdir = outputdir
+ self.rootdir = rootdir
+ self.parseonly = parseonly
+ self.printonly = printonly
+ self.use_explores_name = use_explores_name
+
+ # Configure YAML representers for proper formatting
+ yaml.add_representer(folded_unicode, folded_unicode_representer)
+ yaml.add_representer(literal_unicode, literal_unicode_representer)
+
+ def cubes(self, file_path: str) -> Dict[str, Any]:
+ """Generate cube definitions from LookML views.
+
+ Converts LookML views into Cube cube definitions, handling dimensions, measures,
+ and basic join relationships.
+
+ Args:
+ file_path (str): Path to LookML file(s) to process (supports glob patterns).
+
+ Returns:
+ dict: Result dictionary containing:
+ - 'lookml_model': Parsed LookML model (always present)
+ - 'parsed_model': Pretty-printed model string (if parseonly=True)
+ - 'cube_def': Generated cube definitions (unless parseonly=True)
+ - 'yaml_output': YAML string representation (if printonly=True)
+ - 'summary': File generation summary (if files were written)
+
+ Raises:
+ ValueError: If no files are found at the specified path.
+
+ Example:
+ >>> converter = LookMLConverter()
+ >>> result = converter.cubes("models/orders.lkml")
+ >>> print(result['cube_def']['cubes'][0]['name'])
+ 'orders'
+ """
+ lookml_model = file_loader(file_path, self.rootdir)
+
+ if lookml_model is None:
+ raise ValueError(f"No files were found on path: {file_path}")
+
+ result = {'lookml_model': lookml_model}
+
+ if self.parseonly:
+ result['parsed_model'] = pprint.pformat(lookml_model)
+ return result
+
+ cube_def = parse_view(lookml_model)
+ cube_def = generate_cube_joins(cube_def, lookml_model)
+ result['cube_def'] = cube_def
+
+ if self.printonly:
+ yaml_output = yaml.dump(cube_def, allow_unicode=True)
+ result['yaml_output'] = yaml_output
+ console.print(yaml_output)
+ return result
+
+ summary = write_files(cube_def, outputdir=self.outputdir)
+ result['summary'] = summary
+ print_summary(summary)
+
+ return result
+
+ def views(self, file_path: str) -> Dict[str, Any]:
+ """Generate cube definitions with views from LookML explores.
+
+ Converts LookML explores into Cube definitions including both cubes and views
+ with join relationships.
+
+ Args:
+ file_path (str): Path to LookML file(s) to process (supports glob patterns).
+
+ Returns:
+ dict: Result dictionary containing:
+ - 'lookml_model': Parsed LookML model (always present)
+ - 'parsed_model': Pretty-printed model string (if parseonly=True)
+ - 'cube_def': Generated cube definitions with views (unless parseonly=True)
+ - 'yaml_output': YAML string representation (if printonly=True)
+ - 'summary': File generation summary (if files were written)
+
+ Raises:
+ ValueError: If no files are found at the specified path.
+
+ Example:
+ >>> converter = LookMLConverter(use_explores_name=True)
+ >>> result = converter.views("models/explores.lkml")
+ >>> print(len(result['cube_def']['views']))
+ 2
+ """
+ lookml_model = file_loader(file_path, self.rootdir)
+
+ if lookml_model is None:
+ raise ValueError(f"No files were found on path: {file_path}")
+
+ result = {'lookml_model': lookml_model}
+
+ if self.parseonly:
+ result['parsed_model'] = pprint.pformat(lookml_model)
+ return result
+
+ cube_def = parse_explores(lookml_model, self.use_explores_name)
+ result['cube_def'] = cube_def
+
+ if self.printonly:
+ yaml_output = yaml.dump(cube_def, allow_unicode=True)
+ result['yaml_output'] = yaml_output
+ console.print(yaml_output)
+ return result
+
+ summary = write_files(cube_def, outputdir=self.outputdir)
+ result['summary'] = summary
+ print_summary(summary)
+
+ return result
+
+ def explores(self, metaurl: str, token: str) -> Dict[str, Any]:
+ """Generate LookML explores from Cube meta API.
+
+ Fetches Cube model from meta API and converts it to LookML explores,
+ correctly mapping Cube cubes to LookML views and Cube views to LookML explores.
+
+ Args:
+ metaurl (str): URL to the Cube meta API endpoint.
+ token (str): JWT token for Cube meta API authentication.
+
+ Returns:
+ dict: Result dictionary containing:
+ - 'cube_model': Raw Cube model from meta API (always present)
+ - 'parsed_model': Pretty-printed model string (if parseonly=True)
+ - 'lookml_model': Converted LookML model (unless parseonly=True)
+ - 'yaml_output': YAML string representation (if printonly=True)
+ - 'summary': File generation summary (if files were written)
+
+ Raises:
+ ValueError: If no response is received from the meta API.
+ Exception: If API request fails or token is invalid.
+
+ Example:
+ >>> converter = LookMLConverter(outputdir="/tmp/lookml")
+ >>> result = converter.explores("https://api.cube.dev/v1/meta", "jwt-token")
+ >>> print(len(result['lookml_model']['explores']))
+ 3
+ """
+ cube_model = meta_loader(meta_url=metaurl, token=token)
+
+ if cube_model is None:
+ raise ValueError(f"No response received from: {metaurl}")
+
+ result = {'cube_model': cube_model}
+
+ if self.parseonly:
+ result['parsed_model'] = pprint.pformat(cube_model)
+ return result
+
+ lookml_model = parse_meta(cube_model)
+ result['lookml_model'] = lookml_model
+
+ if self.printonly:
+ yaml_output = yaml.dump(lookml_model, allow_unicode=True)
+ result['yaml_output'] = yaml_output
+ console.print(yaml_output)
+ return result
+
+ summary = write_lookml_files(lookml_model, outputdir=self.outputdir)
+ result['summary'] = summary
+ print_summary(summary)
+
+ return result
+
+ def set_config(
+ self,
+ outputdir: Optional[str] = None,
+ rootdir: Optional[str] = None,
+ parseonly: Optional[bool] = None,
+ printonly: Optional[bool] = None,
+ use_explores_name: Optional[bool] = None,
+ ) -> None:
+ """Update converter configuration options.
+
+ Args:
+ outputdir (str | None, optional): Directory where output files will be written.
+ rootdir (str | None, optional): Root directory for resolving LookML includes.
+ parseonly (bool | None, optional): If True, only parse and return Python dict representation.
+ printonly (bool | None, optional): If True, print YAML output to stdout instead of writing files.
+ use_explores_name (bool | None, optional): Whether to use explore names for cube view names.
+
+ Example:
+ >>> converter = LookMLConverter()
+ >>> converter.set_config(outputdir="/new/path", parseonly=True)
+ >>> result = converter.cubes("models/*.lkml")
+ # Will now parse only and use the new output directory
+ """
+ if outputdir is not None:
+ self.outputdir = outputdir
+ if rootdir is not None:
+ self.rootdir = rootdir
+ if parseonly is not None:
+ self.parseonly = parseonly
+ if printonly is not None:
+ self.printonly = printonly
+ if use_explores_name is not None:
+ self.use_explores_name = use_explores_name
+
+ def get_config(self) -> Dict[str, Any]:
+ """Get current converter configuration.
+
+ Returns:
+ dict: Current configuration settings.
+
+ Example:
+ >>> converter = LookMLConverter(outputdir="/tmp")
+ >>> config = converter.get_config()
+ >>> print(config['outputdir'])
+ '/tmp'
+ """
+ return {
+ 'outputdir': self.outputdir,
+ 'rootdir': self.rootdir,
+ 'parseonly': self.parseonly,
+ 'printonly': self.printonly,
+ 'use_explores_name': self.use_explores_name,
+ }
+
+ def validate_files(self, file_paths: List[str]) -> Dict[str, bool]:
+ """Validate that LookML files exist and can be loaded.
+
+ Args:
+ file_paths (list[str]): List of file paths to validate.
+
+ Returns:
+ dict: Dictionary mapping file paths to validation results.
+
+ Example:
+ >>> converter = LookMLConverter()
+ >>> results = converter.validate_files(["models/orders.lkml", "models/missing.lkml"])
+ >>> print(results["models/orders.lkml"])
+ True
+ """
+ results = {}
+ for file_path in file_paths:
+ try:
+ lookml_model = file_loader(file_path, self.rootdir)
+ results[file_path] = lookml_model is not None
+ except Exception:
+ results[file_path] = False
+ return results
+
+ def clear_cache(self) -> None:
+ """Clear the global file loader cache.
+
+ This method clears the visited_path cache used by the file_loader to prevent
+ circular includes. Useful for ensuring clean state between operations or
+ in testing scenarios.
+
+ Example:
+ >>> converter = LookMLConverter()
+ >>> converter.cubes("models/orders.lkml") # Populates cache
+ >>> converter.clear_cache() # Clears cache
+ >>> converter.cubes("models/orders.lkml") # Loads fresh from disk
+ """
+ from lkml2cube.parser import loader
+ loader.visited_path.clear()
+
+ def __repr__(self) -> str:
+ """Return string representation of the converter."""
+ return (
+ f"LookMLConverter(outputdir={self.outputdir!r}, "
+ f"rootdir={self.rootdir!r}, parseonly={self.parseonly}, "
+ f"printonly={self.printonly}, use_explores_name={self.use_explores_name})"
+ )
\ No newline at end of file
diff --git a/lkml2cube/main.py b/lkml2cube/main.py
index daa60b2..95e0724 100644
--- a/lkml2cube/main.py
+++ b/lkml2cube/main.py
@@ -1,3 +1,15 @@
+"""
+Main CLI module for lkml2cube - LookML to Cube bidirectional converter.
+
+This module provides the command-line interface for lkml2cube, offering three main commands:
+- cubes: Convert LookML views to Cube definitions
+- views: Convert LookML explores to Cube definitions with views
+- explores: Generate LookML explores from Cube meta API
+
+The CLI is built using Typer and provides rich console output with proper error handling.
+Each command supports various options for parsing, output formatting, and file generation.
+"""
+
import pprint
import rich
import typer
@@ -23,8 +35,20 @@
@app.callback()
def callback():
- """
- lkml2cube is a tool to convert LookML models into Cube data models.
+ """Main callback function for the lkml2cube CLI application.
+
+ This function serves as the entry point for the CLI and provides
+ general information about the tool. It sets up the global context
+ for all subcommands.
+
+ Note:
+ This function is called automatically by Typer when the CLI is invoked.
+ It doesn't perform any specific actions but serves as a placeholder
+ for global CLI configuration.
+
+ Example:
+ $ lkml2cube --help
+ # Shows help information for the entire CLI
"""
# console.print(("lkml2cube is a tool to convert LookML models into Cube data models.\n"
# "Use lkml2cube --help to see usage."))
@@ -53,8 +77,31 @@ def cubes(
str, typer.Option(help="The path to prepend to include paths")
] = None,
):
- """
- Generate cubes-only given a LookML file that contains LookML Views.
+ """Generate Cube model definitions from LookML view files.
+
+ Converts LookML view files into Cube YAML definitions, handling dimensions,
+ measures, and basic join relationships. This command focuses on generating
+ cube definitions only (no views).
+
+ Args:
+ file_path (str): Path to the LookML file to process (supports glob patterns).
+ parseonly (bool, optional): If True, only displays the parsed LookML as Python dict. Defaults to False.
+ outputdir (str, optional): Directory where output files will be written. Defaults to ".".
+ printonly (bool, optional): If True, prints YAML to stdout instead of writing files. Defaults to False.
+ rootdir (str | None, optional): Root directory for resolving LookML includes. Defaults to None.
+
+ Raises:
+ typer.Exit: If no files are found at the specified path.
+
+ Example:
+ $ lkml2cube cubes models/orders.view.lkml --outputdir output/
+ # Generates cube definitions in output/cubes/
+
+ $ lkml2cube cubes models/orders.view.lkml --parseonly
+ # Shows parsed LookML structure
+
+ $ lkml2cube cubes models/orders.view.lkml --printonly
+ # Prints YAML to console
"""
lookml_model = file_loader(file_path, rootdir)
@@ -103,8 +150,32 @@ def views(
bool, typer.Option(help="Use explore names for cube view names")
] = False,
):
- """
- Generate cubes-only given a LookML file that contains LookML Views.
+ """Generate Cube model definitions with views from LookML explore files.
+
+ Converts LookML explore files into Cube YAML definitions, creating both
+ cube definitions and view definitions with join relationships. This command
+ generates a complete Cube model with views that define how cubes relate to each other.
+
+ Args:
+ file_path (str): Path to the LookML explore file to process (supports glob patterns).
+ parseonly (bool, optional): If True, only displays the parsed LookML as Python dict. Defaults to False.
+ outputdir (str, optional): Directory where output files will be written. Defaults to ".".
+ printonly (bool, optional): If True, prints YAML to stdout instead of writing files. Defaults to False.
+ rootdir (str | None, optional): Root directory for resolving LookML includes. Defaults to None.
+ use_explores_name (bool, optional): If True, uses explore names for cube view names. Defaults to False.
+
+ Raises:
+ typer.Exit: If no files are found at the specified path.
+
+ Example:
+ $ lkml2cube views models/explores.lkml --outputdir output/
+ # Generates cubes and views in output/cubes/ and output/views/
+
+ $ lkml2cube views models/explores.lkml --use-explores-name
+ # Uses explore names for view naming
+
+ $ lkml2cube views models/explores.lkml --parseonly
+ # Shows parsed LookML structure
"""
lookml_model = file_loader(file_path, rootdir)
@@ -147,8 +218,33 @@ def explores(
bool, typer.Option(help="Print to stdout the parsed files")
] = False,
):
- """
- Generate cubes-only given a LookML file that contains LookML Views.
+ """Generate LookML explores and views from Cube meta API.
+
+ Fetches Cube model definitions from the meta API and converts them to
+ production-ready LookML files. This command correctly maps:
+ - Cube cubes (with sql_table/sql) → LookML views
+ - Cube views (with aliasMember joins) → LookML explores
+
+ Args:
+ metaurl (str): URL to the Cube meta API endpoint (e.g., https://api.cube.dev/v1/meta).
+ token (str): JWT authentication token for the Cube meta API.
+ parseonly (bool, optional): If True, only displays the parsed Cube model as Python dict. Defaults to False.
+ outputdir (str, optional): Directory where output files will be written. Defaults to ".".
+ printonly (bool, optional): If True, prints YAML to stdout instead of writing files. Defaults to False.
+
+ Raises:
+ typer.Exit: If no response is received from the meta API.
+ ValueError: If the token is invalid or API request fails.
+
+ Example:
+ $ lkml2cube explores "https://api.cube.dev/v1/meta" --token "jwt-token" --outputdir lookml/
+ # Generates LookML views and explores in lookml/views/ and lookml/explores/
+
+ $ lkml2cube explores "https://api.cube.dev/v1/meta" --token "jwt-token" --parseonly
+ # Shows parsed Cube model structure
+
+ $ lkml2cube explores "https://api.cube.dev/v1/meta" --token "jwt-token" --printonly
+ # Prints generated LookML to console
"""
cube_model = meta_loader(
@@ -175,4 +271,9 @@ def explores(
if __name__ == "__main__":
+ """Entry point for the CLI application when run as a script.
+
+ This block is executed when the module is run directly (not imported).
+ It starts the Typer application which handles command parsing and routing.
+ """
app()
diff --git a/lkml2cube/parser/cube_api.py b/lkml2cube/parser/cube_api.py
index ed5d87d..732788d 100644
--- a/lkml2cube/parser/cube_api.py
+++ b/lkml2cube/parser/cube_api.py
@@ -6,8 +6,23 @@ def meta_loader(
meta_url: str,
token: str,
) -> dict:
- """
- Load the Cube meta API and return the model as a dictionary.
+ """Load the Cube meta API and return the model as a dictionary.
+
+ Args:
+ meta_url (str): URL to the Cube meta API endpoint.
+ token (str): Authentication token for the API.
+
+ Returns:
+ dict: Cube model data from the meta API.
+
+ Raises:
+ ValueError: If no valid token is provided.
+ Exception: If the API request fails or returns non-200 status.
+
+ Example:
+ >>> model = meta_loader('https://api.cube.dev/v1/meta', 'my-token')
+ >>> print(model['cubes'][0]['name'])
+ orders
"""
if not token:
@@ -27,8 +42,19 @@ def meta_loader(
def parse_members(members: list) -> list:
- """
- Parse measures and dimensions from the Cube meta model.
+ """Parse measures and dimensions from the Cube meta model.
+
+ Args:
+ members (list): List of dimension or measure definitions from Cube meta.
+
+ Returns:
+ list: List of parsed members in LookML format.
+
+ Example:
+ >>> members = [{'name': 'total_sales', 'type': 'sum', 'sql': 'amount'}]
+ >>> parsed = parse_members(members)
+ >>> print(parsed[0]['name'])
+ total_sales
"""
rpl_table = (
@@ -64,9 +90,22 @@ def parse_members(members: list) -> list:
def parse_meta(cube_model: dict) -> dict:
- """
- Parse the Cube meta model and return a simplified version.
+ """Parse the Cube meta model and return a simplified version.
+
Separates Cube cubes (-> LookML views) from Cube views (-> LookML explores).
+
+ Args:
+ cube_model (dict): Complete Cube model from meta API.
+
+ Returns:
+ dict: LookML model with structure:
+ {'views': list, 'explores': list}
+
+ Example:
+ >>> cube_model = {'cubes': [{'name': 'orders', 'sql_table': 'orders'}]}
+ >>> lookml_model = parse_meta(cube_model)
+ >>> print(lookml_model['views'][0]['name'])
+ orders
"""
lookml_model = {
@@ -91,9 +130,23 @@ def parse_meta(cube_model: dict) -> dict:
def _is_cube_view(model: dict) -> bool:
- """
- Determine if a Cube model is a view (has joins) or a cube (has its own data source).
+ """Determine if a Cube model is a view (has joins) or a cube (has its own data source).
+
Views typically have aliasMember references and no sql_table/sql property.
+
+ Args:
+ model (dict): Cube model definition.
+
+ Returns:
+ bool: True if the model is a view, False if it's a cube.
+
+ Example:
+ >>> model = {'dimensions': [{'aliasMember': 'orders.id'}]}
+ >>> _is_cube_view(model)
+ True
+ >>> model = {'sql_table': 'orders', 'dimensions': [{'name': 'id'}]}
+ >>> _is_cube_view(model)
+ False
"""
# Check if any dimensions or measures use aliasMember (indicating joins)
has_alias_members = False
@@ -116,8 +169,21 @@ def _is_cube_view(model: dict) -> bool:
def _parse_cube_to_view(model: dict) -> dict:
- """
- Parse a Cube cube into a LookML view.
+ """Parse a Cube cube into a LookML view.
+
+ Args:
+ model (dict): Cube model definition.
+
+ Returns:
+ dict: LookML view definition with dimensions, measures, and metadata.
+
+ Example:
+ >>> model = {'name': 'orders', 'sql_table': 'orders', 'dimensions': [{'name': 'id', 'type': 'string'}]}
+ >>> view = _parse_cube_to_view(model)
+ >>> print(view['name'])
+ orders
+ >>> print(view['sql_table_name'])
+ orders
"""
view = {
"name": model.get("name"),
@@ -146,8 +212,24 @@ def _parse_cube_to_view(model: dict) -> dict:
def _parse_cube_view_to_explore(model: dict) -> dict:
- """
- Parse a Cube view into a LookML explore with joins.
+ """Parse a Cube view into a LookML explore with joins.
+
+ Args:
+ model (dict): Cube view model definition with aliasMember references.
+
+ Returns:
+ dict: LookML explore definition with joins based on referenced cubes.
+
+ Example:
+ >>> model = {
+ ... 'name': 'orders_analysis',
+ ... 'dimensions': [{'aliasMember': 'orders.id'}, {'aliasMember': 'customers.name'}]
+ ... }
+ >>> explore = _parse_cube_view_to_explore(model)
+ >>> print(explore['name'])
+ orders_analysis
+ >>> print(len(explore['joins']))
+ 1
"""
explore = {
"name": model.get("name"),
diff --git a/lkml2cube/parser/explores.py b/lkml2cube/parser/explores.py
index 2cc875c..724fa33 100644
--- a/lkml2cube/parser/explores.py
+++ b/lkml2cube/parser/explores.py
@@ -10,6 +10,20 @@
def snakify(s):
+ """Convert a string to snake_case format.
+
+ Args:
+ s (str): String to convert to snake_case.
+
+ Returns:
+ str: Snake_case version of the input string.
+
+ Example:
+ >>> snakify('MyViewName')
+ 'my_view_name'
+ >>> snakify('Order-Details')
+ 'order_details'
+ """
return "_".join(
re.sub(
"([A-Z][a-z]+)", r" \1", re.sub("([A-Z]+)", r" \1", s.replace("-", " "))
@@ -18,6 +32,21 @@ def snakify(s):
def build_cube_name_look_up(cube_def):
+ """Build a lookup dictionary for cube names in the cube definition.
+
+ Args:
+ cube_def (dict): Cube definition containing 'cubes' list.
+
+ Note:
+ This function modifies the cube_def dictionary in place by adding
+ a 'cube_name_look_up' key if it doesn't exist.
+
+ Example:
+ >>> cube_def = {'cubes': [{'name': 'orders'}, {'name': 'customers'}]}
+ >>> build_cube_name_look_up(cube_def)
+ >>> print('orders' in cube_def['cube_name_look_up'])
+ True
+ """
if "cube_name_look_up" in cube_def:
return
cube_name_look_up = {}
@@ -27,6 +56,21 @@ def build_cube_name_look_up(cube_def):
def get_cube_from_cube_def(cube_def, cube_name):
+ """Get a cube definition by name from the cube definition.
+
+ Args:
+ cube_def (dict): Cube definition containing 'cubes' list.
+ cube_name (str): Name of the cube to retrieve.
+
+ Returns:
+ dict | None: Cube definition if found, None otherwise.
+
+ Example:
+ >>> cube_def = {'cubes': [{'name': 'orders', 'sql_table': 'orders'}]}
+ >>> cube = get_cube_from_cube_def(cube_def, 'orders')
+ >>> print(cube['sql_table'])
+ orders
+ """
if "cube_name_look_up" not in cube_def:
build_cube_name_look_up(cube_def)
if cube_name in cube_def["cube_name_look_up"]:
@@ -35,10 +79,38 @@ def get_cube_from_cube_def(cube_def, cube_name):
def get_cube_names_from_join_condition(join_condition):
+ """Extract cube names from a join condition SQL string.
+
+ Args:
+ join_condition (str): SQL join condition containing cube references.
+
+ Returns:
+ list[str]: List of cube names found in the join condition.
+
+ Example:
+ >>> join_condition = '${orders.customer_id} = ${customers.id}'
+ >>> get_cube_names_from_join_condition(join_condition)
+ ['orders', 'customers']
+ """
return [cube.split(".")[0] for cube in re.findall(snake_case, join_condition)]
def traverse_graph(join_paths, cube_left, cube_right):
+ """Find the shortest path between two cubes using BFS traversal.
+
+ Args:
+ join_paths (dict): Dictionary mapping cube names to their connected cubes.
+ cube_left (str): Starting cube name.
+ cube_right (str): Target cube name.
+
+ Returns:
+ str: Dot-separated path from cube_left to cube_right.
+
+ Example:
+ >>> join_paths = {'orders': ['customers'], 'customers': ['addresses']}
+ >>> traverse_graph(join_paths, 'orders', 'addresses')
+ 'orders.customers.addresses'
+ """
# Create a queue for BFS
queue = []
queue.append([cube_left])
@@ -66,6 +138,25 @@ def traverse_graph(join_paths, cube_left, cube_right):
def generate_cube_joins(cube_def, lookml_model):
+ """Generate cube join definitions from LookML explores.
+
+ Args:
+ cube_def (dict): Existing cube definition to modify.
+ lookml_model (dict): LookML model containing explores with joins.
+
+ Returns:
+ dict: Updated cube definition with join information added to cubes.
+
+ Raises:
+ Exception: If cube referenced in explores is not found.
+
+ Example:
+ >>> cube_def = {'cubes': [{'name': 'orders'}, {'name': 'customers'}]}
+ >>> lookml_model = {'explores': [{'joins': [{'name': 'customers', 'sql_on': '${orders.customer_id} = ${customers.id}', 'relationship': 'many_to_one'}]}]}
+ >>> updated_def = generate_cube_joins(cube_def, lookml_model)
+ >>> print(updated_def['cubes'][1]['joins'][0]['name'])
+ orders
+ """
if "explores" not in lookml_model or not lookml_model["explores"]:
return cube_def
for explore in lookml_model["explores"]:
@@ -122,6 +213,23 @@ def generate_cube_joins(cube_def, lookml_model):
def generate_cube_views(cube_def, lookml_model, use_explores_name=False):
+ """Generate Cube view definitions from LookML explores.
+
+ Args:
+ cube_def (dict): Cube definition to add views to.
+ lookml_model (dict): LookML model containing explores.
+ use_explores_name (bool, optional): Whether to use explore names as view names. Defaults to False.
+
+ Returns:
+ dict: Updated cube definition with view definitions added.
+
+ Example:
+ >>> cube_def = {'cubes': [{'name': 'orders'}]}
+ >>> lookml_model = {'explores': [{'name': 'orders_explore', 'label': 'Orders Analysis'}]}
+ >>> updated_def = generate_cube_views(cube_def, lookml_model)
+ >>> print(updated_def['views'][0]['name'])
+ orders_analysis
+ """
if "views" not in cube_def:
cube_def["views"] = []
if "explores" not in lookml_model or not lookml_model["explores"]:
@@ -184,6 +292,29 @@ def generate_cube_views(cube_def, lookml_model, use_explores_name=False):
def parse_explores(lookml_model, use_explores_name=False):
+ """Parse LookML explores into Cube definitions with joins and views.
+
+ Args:
+ lookml_model (dict): LookML model containing views and explores.
+ use_explores_name (bool, optional): Whether to use explore names as view names. Defaults to False.
+
+ Returns:
+ dict: Complete cube definition with cubes, joins, and views.
+
+ Raises:
+ Exception: If no explores are found in the LookML model.
+
+ Example:
+ >>> lookml_model = {
+ ... 'views': [
+ ... {'name': 'orders', 'sql_table_name': 'orders'},
+ ... {'name': 'customers', 'sql_table_name': 'customers'}
+ ... ],
+ ... 'explores': [{'name': 'orders_explore', 'joins': [{'name': 'customers', 'sql_on': '${orders.customer_id} = ${customers.id}', 'relationship': 'many_to_one'}]}]
+ ... }
+ >>> cube_def = parse_explores(lookml_model)
+ >>> print(len(cube_def['cubes']))
+ 2
+ >>> print(len(cube_def['views']))
+ 1
+ """
# First we read all possible lookml views.
cube_def = parse_view(lookml_model, raise_when_views_not_present=False)
if "explores" not in lookml_model:
diff --git a/lkml2cube/parser/loader.py b/lkml2cube/parser/loader.py
index 6833581..9b1a8e5 100644
--- a/lkml2cube/parser/loader.py
+++ b/lkml2cube/parser/loader.py
@@ -14,6 +14,21 @@
def update_namespace(namespace, new_file):
+ """Update namespace with new file content, merging lists and handling conflicts.
+
+ Args:
+ namespace (dict | None): Existing namespace dictionary or None.
+ new_file (dict): New file content to merge into namespace.
+
+ Returns:
+ dict: Updated namespace with merged content.
+
+ Example:
+ >>> namespace = {'views': [{'name': 'view1'}]}
+ >>> new_file = {'views': [{'name': 'view2'}]}
+ >>> update_namespace(namespace, new_file)
+ {'views': [{'name': 'view1'}, {'name': 'view2'}]}
+ """
if namespace is None:
return new_file
@@ -32,6 +47,25 @@ def update_namespace(namespace, new_file):
def file_loader(file_path_input, rootdir_param, namespace=None):
+ """Load LookML files and resolve includes recursively.
+
+ Args:
+ file_path_input (str): File path pattern to load (supports glob patterns).
+ rootdir_param (str | None): Root directory for resolving includes.
+ namespace (dict | None): Existing namespace to merge content into.
+
+ Returns:
+ dict: Loaded LookML model with resolved includes.
+
+ Raises:
+ FileNotFoundError: If specified file path cannot be found.
+ ValueError: If LookML file cannot be parsed.
+
+ Example:
+ >>> namespace = file_loader('models/*.lkml', '/path/to/root')
+ >>> print(namespace['views'][0]['name'])
+ my_view
+ """
file_paths = glob.glob(file_path_input)
for file_path in file_paths:
@@ -65,6 +99,21 @@ def write_single_file(
subdir: str = "cubes",
file_name: str = "my_cubes.yml",
):
+ """Write a single cube definition to a YAML file.
+
+ Args:
+ cube_def (dict): Cube definition to write.
+ outputdir (str): Output directory path.
+ subdir (str, optional): Subdirectory within output directory. Defaults to "cubes".
+ file_name (str, optional): Name of the output file. Defaults to "my_cubes.yml".
+
+ Raises:
+ OSError: If output directory cannot be created or file cannot be written.
+
+ Example:
+ >>> cube_def = {'cubes': [{'name': 'orders', 'sql_table': 'orders'}]}
+ >>> write_single_file(cube_def, '/output', 'cubes', 'orders.yml')
+ """
f = open(join(outputdir, subdir, file_name), "w")
f.write(yaml.dump(cube_def, allow_unicode=True))
@@ -72,6 +121,26 @@ def write_single_file(
def write_files(cube_def, outputdir):
+ """Write cube definitions to separate files organized by type.
+
+ Args:
+ cube_def (dict): Cube definitions containing 'cubes' and/or 'views' keys.
+ outputdir (str): Output directory path.
+
+ Returns:
+ dict: Summary of written files with structure:
+ {'cubes': [{'name': str, 'path': str}], 'views': [{'name': str, 'path': str}]}
+
+ Raises:
+ Exception: If no cube definition is provided.
+ OSError: If output directory cannot be created or files cannot be written.
+
+ Example:
+ >>> cube_def = {'cubes': [{'name': 'orders'}], 'views': [{'name': 'orders_view'}]}
+ >>> summary = write_files(cube_def, '/output')
+ >>> print(summary['cubes'][0]['name'])
+ orders
+ """
summary = {"cubes": [], "views": []}
@@ -128,8 +197,25 @@ def write_files(cube_def, outputdir):
def write_lookml_files(lookml_model, outputdir):
- """
- Write LookML model to files in the output directory.
+ """Write LookML model to files in the output directory.
+
+ Args:
+ lookml_model (dict): LookML model containing 'views' and/or 'explores' keys.
+ outputdir (str): Output directory path.
+
+ Returns:
+ dict: Summary of written files with structure:
+ {'views': [{'name': str, 'path': str}], 'explores': [{'name': str, 'path': str}]}
+
+ Raises:
+ Exception: If no LookML model is provided.
+ OSError: If output directory cannot be created or files cannot be written.
+
+ Example:
+ >>> lookml_model = {'views': [{'name': 'orders'}], 'explores': [{'name': 'orders_explore'}]}
+ >>> summary = write_lookml_files(lookml_model, '/output')
+ >>> print(summary['views'][0]['name'])
+ orders
"""
summary = {"views": [], "explores": []}
@@ -169,8 +255,23 @@ def write_lookml_files(lookml_model, outputdir):
def _generate_lookml_content(element, element_type, includes=None):
- """
- Generate LookML content for a view or explore element.
+ """Generate LookML content for a view or explore element.
+
+ Args:
+ element (dict): View or explore element definition.
+ element_type (str): Type of element ('view' or 'explore').
+ includes (list[str] | None, optional): List of include statements to add. Defaults to None.
+
+ Returns:
+ str: Generated LookML content as a string.
+
+ Example:
+ >>> element = {'name': 'orders', 'sql_table_name': 'orders'}
+ >>> content = _generate_lookml_content(element, 'view')
+ >>> print(content)
+ view: orders {
+ sql_table_name: orders ;;
+ }
"""
lines = []
name = element.get("name", "unnamed")
@@ -247,8 +348,23 @@ def _generate_lookml_content(element, element_type, includes=None):
def _generate_dimension_lines(dimension):
- """
- Generate LookML lines for a dimension.
+ """Generate LookML lines for a dimension.
+
+ Args:
+ dimension (dict): Dimension definition containing name, type, sql, etc.
+
+ Returns:
+ list[str]: List of LookML lines representing the dimension.
+
+ Example:
+ >>> dimension = {'name': 'order_id', 'type': 'number', 'sql': '${TABLE}.id'}
+ >>> lines = _generate_dimension_lines(dimension)
+ >>> print('\\n'.join(lines))
+ dimension: order_id {
+ type: number
+ sql: ${TABLE}.id ;;
+ }
"""
lines = []
name = dimension.get("name", "unnamed")
@@ -287,8 +403,23 @@ def _generate_dimension_lines(dimension):
def _generate_measure_lines(measure):
- """
- Generate LookML lines for a measure.
+ """Generate LookML lines for a measure.
+
+ Args:
+ measure (dict): Measure definition containing name, type, sql, etc.
+
+ Returns:
+ list[str]: List of LookML lines representing the measure.
+
+ Example:
+ >>> measure = {'name': 'total_orders', 'type': 'count', 'sql': '${TABLE}.id'}
+ >>> lines = _generate_measure_lines(measure)
+ >>> print('\\n'.join(lines))
+ measure: total_orders {
+ type: count
+ sql: ${TABLE}.id ;;
+ }
"""
lines = []
name = measure.get("name", "unnamed")
@@ -325,8 +456,22 @@ def _generate_measure_lines(measure):
def _generate_filter_lines(filter_def):
- """
- Generate LookML lines for a filter.
+ """Generate LookML lines for a filter.
+
+ Args:
+ filter_def (dict): Filter definition containing name, type, description, etc.
+
+ Returns:
+ list[str]: List of LookML lines representing the filter.
+
+ Example:
+ >>> filter_def = {'name': 'date_filter', 'type': 'date', 'description': 'Filter by date'}
+ >>> lines = _generate_filter_lines(filter_def)
+ >>> print('\\n'.join(lines))
+ filter: date_filter {
+ description: "Filter by date"
+ type: date
+ }
"""
lines = []
name = filter_def.get("name", "unnamed")
@@ -346,8 +491,23 @@ def _generate_filter_lines(filter_def):
def _generate_join_lines(join):
- """
- Generate LookML lines for a join.
+ """Generate LookML lines for a join.
+
+ Args:
+ join (dict): Join definition containing name, type, relationship, sql_on, etc.
+
+ Returns:
+ list[str]: List of LookML lines representing the join.
+
+ Example:
+ >>> join = {'name': 'customers', 'type': 'left_outer', 'relationship': 'many_to_one', 'sql_on': '${orders.customer_id} = ${customers.id}'}
+ >>> lines = _generate_join_lines(join)
+ >>> print('\\n'.join(lines))
+ join: customers {
+ type: left_outer
+ relationship: many_to_one
+ sql_on: ${orders.customer_id} = ${customers.id} ;;
+ }
"""
lines = []
name = join.get("name", "unnamed")
@@ -370,8 +530,21 @@ def _generate_join_lines(join):
def _generate_includes_for_explore(explore, lookml_model):
- """
- Generate include statements for an explore based on the views it references.
+ """Generate include statements for an explore based on the views it references.
+
+ Args:
+ explore (dict): Explore definition containing view_name and joins.
+ lookml_model (dict): Complete LookML model to check for view existence.
+
+ Returns:
+ list[str]: List of include file paths for the explore.
+
+ Example:
+ >>> explore = {'name': 'orders', 'view_name': 'orders', 'joins': [{'name': 'customers'}]}
+ >>> lookml_model = {'views': [{'name': 'orders'}, {'name': 'customers'}]}
+ >>> includes = _generate_includes_for_explore(explore, lookml_model)
+ >>> print(includes)
+ ['/views/orders.view.lkml', '/views/customers.view.lkml']
"""
includes = []
referenced_views = set()
@@ -396,6 +569,18 @@ def _generate_includes_for_explore(explore, lookml_model):
def print_summary(summary):
+ """Print a formatted summary of generated files using Rich tables.
+
+ Args:
+ summary (dict): Summary dictionary containing file information with keys
+ 'cubes', 'views', and/or 'explores', each containing lists of
+ {'name': str, 'path': str} dictionaries.
+
+ Example:
+ >>> summary = {'cubes': [{'name': 'orders', 'path': '/output/cubes/orders.yml'}]}
+ >>> print_summary(summary)
+ # Displays a formatted table showing the generated files
+ """
# Use the proper Rich console for table rendering
rich_console = rich.console.Console()
diff --git a/lkml2cube/parser/types.py b/lkml2cube/parser/types.py
index 913f95b..aa7c9e1 100644
--- a/lkml2cube/parser/types.py
+++ b/lkml2cube/parser/types.py
@@ -3,7 +3,24 @@
# console = rich.console.Console()
class Console:
+ """Simple console wrapper for printing messages.
+
+ This class provides a basic print interface compatible with Rich console
+ while falling back to standard print functionality.
+ """
+
def print(self, s, *args):
+ """Print a message to the console.
+
+ Args:
+ s (str): Message to print.
+ *args: Additional arguments (currently ignored).
+
+ Example:
+ >>> console = Console()
+ >>> console.print("Hello world", style="bold")
+ Hello world
+ """
print(s)
@@ -39,16 +56,64 @@ def print(self, s, *args):
class folded_unicode(str):
+ """String subclass for YAML folded scalar representation.
+
+ This class marks strings that should be represented as folded scalars
+ in YAML output (using the '>' style).
+
+ Example:
+ >>> text = folded_unicode("This is a long\nstring that will be folded")
+ >>> # When dumped to YAML, will use '>' style
+ """
pass
class literal_unicode(str):
+ """String subclass for YAML literal scalar representation.
+
+ This class marks strings that should be represented as literal scalars
+ in YAML output (using the '|' style).
+
+ Example:
+ >>> sql = literal_unicode("SELECT *\nFROM table\nWHERE id = 1")
+ >>> # When dumped to YAML, will use '|' style preserving line breaks
+ """
pass
def folded_unicode_representer(dumper, data):
+ """YAML representer for folded_unicode strings.
+
+ Args:
+ dumper: YAML dumper instance.
+ data (folded_unicode): String data to represent.
+
+ Returns:
+ Scalar representation with folded style.
+
+ Example:
+ >>> import yaml
+ >>> yaml.add_representer(folded_unicode, folded_unicode_representer)
+ >>> yaml.dump(folded_unicode("long text"))
+ '> long text\n'
+ """
return dumper.represent_scalar("tag:yaml.org,2002:str", data, style=">")
def literal_unicode_representer(dumper, data):
+ """YAML representer for literal_unicode strings.
+
+ Args:
+ dumper: YAML dumper instance.
+ data (literal_unicode): String data to represent.
+
+ Returns:
+ Scalar representation with literal style.
+
+ Example:
+ >>> import yaml
+ >>> yaml.add_representer(literal_unicode, literal_unicode_representer)
+ >>> yaml.dump(literal_unicode("SELECT *\nFROM table"))
+ '|\n SELECT *\n FROM table\n'
+ """
return dumper.represent_scalar("tag:yaml.org,2002:str", data, style="|")
diff --git a/lkml2cube/parser/views.py b/lkml2cube/parser/views.py
index 4882f42..3476310 100644
--- a/lkml2cube/parser/views.py
+++ b/lkml2cube/parser/views.py
@@ -6,6 +6,37 @@
def parse_view(lookml_model, raise_when_views_not_present=True):
+ """Parse LookML views into Cube definitions.
+
+ Converts LookML view definitions into Cube format, handling dimensions, measures,
+ view inheritance, and various LookML-specific features like tiers and drill fields.
+
+ Args:
+ lookml_model (dict): LookML model containing views to parse.
+ raise_when_views_not_present (bool, optional): Whether to raise an exception
+ when no views are found. Defaults to True.
+
+ Returns:
+ dict: Cube definitions with structure:
+ {'cubes': [{'name': str, 'description': str, 'dimensions': list, 'measures': list, 'joins': list}]}
+
+ Raises:
+ Exception: If raise_when_views_not_present is True and no views are found,
+ or if required dimension properties are missing.
+
+ Example:
+ >>> lookml_model = {
+ ... 'views': [{
+ ... 'name': 'orders',
+ ... 'sql_table_name': 'public.orders',
+ ... 'dimensions': [{'name': 'id', 'type': 'number', 'sql': '${TABLE}.id'}],
+ ... 'measures': [{'name': 'count', 'type': 'count'}]
+ ... }]
+ ... }
+ >>> cube_def = parse_view(lookml_model)
+ >>> print(cube_def['cubes'][0]['name'])
+ orders
+ """
cubes = []
cube_def = {"cubes": cubes}
rpl_table = lambda s: s.replace("${TABLE}", "{CUBE}").replace("${", "{")
diff --git a/pdm.lock b/pdm.lock
index 5d5ad8e..910b760 100644
--- a/pdm.lock
+++ b/pdm.lock
@@ -5,11 +5,46 @@
groups = ["default", "test"]
strategy = ["cross_platform"]
lock_version = "4.5.0"
-content_hash = "sha256:bcc1a3aab3bab7a4b535dec41ee5a87d23eb296cb705064437db21ca37daad76"
+content_hash = "sha256:8ea04e10efb4f16feba07d3681045b11e38fa781e11734a294f979de670883d6"
[[metadata.targets]]
requires_python = ">=3.10"
+[[package]]
+name = "black"
+version = "25.1.0"
+requires_python = ">=3.9"
+summary = "The uncompromising code formatter."
+dependencies = [
+ "click>=8.0.0",
+ "mypy-extensions>=0.4.3",
+ "packaging>=22.0",
+ "pathspec>=0.9.0",
+ "platformdirs>=2",
+ "tomli>=1.1.0; python_version < \"3.11\"",
+ "typing-extensions>=4.0.1; python_version < \"3.11\"",
+]
+files = [
+ {file = "black-25.1.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:759e7ec1e050a15f89b770cefbf91ebee8917aac5c20483bc2d80a6c3a04df32"},
+ {file = "black-25.1.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:0e519ecf93120f34243e6b0054db49c00a35f84f195d5bce7e9f5cfc578fc2da"},
+ {file = "black-25.1.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:055e59b198df7ac0b7efca5ad7ff2516bca343276c466be72eb04a3bcc1f82d7"},
+ {file = "black-25.1.0-cp310-cp310-win_amd64.whl", hash = "sha256:db8ea9917d6f8fc62abd90d944920d95e73c83a5ee3383493e35d271aca872e9"},
+ {file = "black-25.1.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:a39337598244de4bae26475f77dda852ea00a93bd4c728e09eacd827ec929df0"},
+ {file = "black-25.1.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:96c1c7cd856bba8e20094e36e0f948718dc688dba4a9d78c3adde52b9e6c2299"},
+ {file = "black-25.1.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:bce2e264d59c91e52d8000d507eb20a9aca4a778731a08cfff7e5ac4a4bb7096"},
+ {file = "black-25.1.0-cp311-cp311-win_amd64.whl", hash = "sha256:172b1dbff09f86ce6f4eb8edf9dede08b1fce58ba194c87d7a4f1a5aa2f5b3c2"},
+ {file = "black-25.1.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:4b60580e829091e6f9238c848ea6750efed72140b91b048770b64e74fe04908b"},
+ {file = "black-25.1.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:1e2978f6df243b155ef5fa7e558a43037c3079093ed5d10fd84c43900f2d8ecc"},
+ {file = "black-25.1.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:3b48735872ec535027d979e8dcb20bf4f70b5ac75a8ea99f127c106a7d7aba9f"},
+ {file = "black-25.1.0-cp312-cp312-win_amd64.whl", hash = "sha256:ea0213189960bda9cf99be5b8c8ce66bb054af5e9e861249cd23471bd7b0b3ba"},
+ {file = "black-25.1.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:8f0b18a02996a836cc9c9c78e5babec10930862827b1b724ddfe98ccf2f2fe4f"},
+ {file = "black-25.1.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:afebb7098bfbc70037a053b91ae8437c3857482d3a690fefc03e9ff7aa9a5fd3"},
+ {file = "black-25.1.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:030b9759066a4ee5e5aca28c3c77f9c64789cdd4de8ac1df642c40b708be6171"},
+ {file = "black-25.1.0-cp313-cp313-win_amd64.whl", hash = "sha256:a22f402b410566e2d1c950708c77ebf5ebd5d0d88a6a2e87c86d9fb48afa0d18"},
+ {file = "black-25.1.0-py3-none-any.whl", hash = "sha256:95e8176dae143ba9097f351d174fdaf0ccd29efb414b362ae3fd72bf0f710717"},
+ {file = "black-25.1.0.tar.gz", hash = "sha256:33496d5cd1222ad73391352b4ae8da15253c5de89b93a80b3e2c8d9a19ec2666"},
+]
+
[[package]]
name = "certifi"
version = "2025.7.14"
@@ -105,6 +140,102 @@ files = [
{file = "colorama-0.4.6.tar.gz", hash = "sha256:08695f5cb7ed6e0531a20572697297273c47b8cae5a63ffc6d6ed5c201be6e44"},
]
+[[package]]
+name = "databind"
+version = "4.5.2"
+requires_python = "<4.0.0,>=3.8.0"
+summary = "Databind is a library inspired by jackson-databind to de-/serialize Python dataclasses. The `databind` package will install the full suite of databind packages. Compatible with Python 3.8 and newer."
+dependencies = [
+ "Deprecated<2.0.0,>=1.2.12",
+ "nr-date<3.0.0,>=2.0.0",
+ "nr-stream<2.0.0,>=1.0.0",
+ "setuptools>=40.8.0; python_version < \"3.10\"",
+ "typeapi<3,>=2.0.1",
+ "typing-extensions<5,>=3.10.0",
+]
+files = [
+ {file = "databind-4.5.2-py3-none-any.whl", hash = "sha256:b9c3a03c0414aa4567f095d7218ac904bd2b267b58e3763dac28e83d64b69770"},
+ {file = "databind-4.5.2.tar.gz", hash = "sha256:0a8aa0ff130a0306581c559388f5ef65e0fae7ef4b86412eacb1f4a0420006c4"},
+]
+
+[[package]]
+name = "databind-core"
+version = "4.5.2"
+requires_python = "<4.0.0,>=3.8.0"
+summary = "Databind is a library inspired by jackson-databind to de-/serialize Python dataclasses. Compatible with Python 3.8 and newer. Deprecated, use `databind` package."
+dependencies = [
+ "databind<5.0.0,>=4.5.2",
+]
+files = [
+ {file = "databind.core-4.5.2-py3-none-any.whl", hash = "sha256:a1dd1c6bd8ca9907d1292d8df9ec763ce91543e27f7eda4268e4a1a84fcd1c42"},
+ {file = "databind.core-4.5.2.tar.gz", hash = "sha256:b8ac8127bc5d6b239a2a81aeddb268b0c4cadd53fbce7e8b2c7a9ef6413bccb3"},
+]
+
+[[package]]
+name = "databind-json"
+version = "4.5.2"
+requires_python = "<4.0.0,>=3.8.0"
+summary = "De-/serialize Python dataclasses to or from JSON payloads. Compatible with Python 3.8 and newer. Deprecated, use `databind` module instead."
+dependencies = [
+ "databind<5.0.0,>=4.5.2",
+]
+files = [
+ {file = "databind.json-4.5.2-py3-none-any.whl", hash = "sha256:a803bf440634685984361cb2a5a975887e487c854ed48d81ff7aaf3a1ed1e94c"},
+ {file = "databind.json-4.5.2.tar.gz", hash = "sha256:6cc9b5c6fddaebd49b2433932948eb3be8a41633b90aa37998d7922504b8f165"},
+]
+
+[[package]]
+name = "deprecated"
+version = "1.2.18"
+requires_python = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,>=2.7"
+summary = "Python @deprecated decorator to deprecate old python classes, functions or methods."
+dependencies = [
+ "wrapt<2,>=1.10",
+]
+files = [
+ {file = "Deprecated-1.2.18-py2.py3-none-any.whl", hash = "sha256:bd5011788200372a32418f888e326a09ff80d0214bd961147cfed01b5c018eec"},
+ {file = "deprecated-1.2.18.tar.gz", hash = "sha256:422b6f6d859da6f2ef57857761bfb392480502a64c3028ca9bbe86085d72115d"},
+]
+
+[[package]]
+name = "docspec"
+version = "2.2.1"
+requires_python = ">=3.7,<4.0"
+summary = "Docspec is a JSON object specification for representing API documentation of programming languages."
+dependencies = [
+ "Deprecated<2.0.0,>=1.2.12",
+ "databind-core<5.0.0,>=4.2.6",
+ "databind-json<5.0.0,>=4.2.6",
+]
+files = [
+ {file = "docspec-2.2.1-py3-none-any.whl", hash = "sha256:7538f750095a9688c6980ff9a4e029a823a500f64bd00b6b4bdb27951feb31cb"},
+ {file = "docspec-2.2.1.tar.gz", hash = "sha256:4854e77edc0e2de40e785e57e95880f7095a05fe978f8b54cef7a269586e15ff"},
+]
+
+[[package]]
+name = "docspec-python"
+version = "2.2.2"
+requires_python = ">=3.8"
+summary = "A parser based on lib2to3 producing docspec data from Python source code."
+dependencies = [
+ "black>=24.8.0",
+ "docspec==2.2.1",
+ "nr-util>=0.8.12",
+]
+files = [
+ {file = "docspec_python-2.2.2-py3-none-any.whl", hash = "sha256:caa32dc1e8c470af8a5ecad67cca614e68c1563ac01dab0c0486c4d7f709d6b1"},
+ {file = "docspec_python-2.2.2.tar.gz", hash = "sha256:429be834d09549461b95bf45eb53c16859f3dfb3e9220408b3bfb12812ccb3fb"},
+]
+
+[[package]]
+name = "docstring-parser"
+version = "0.11"
+requires_python = ">=3.6"
+summary = "\"Parse Python docstrings in reST, Google and Numpydoc format\""
+files = [
+ {file = "docstring_parser-0.11.tar.gz", hash = "sha256:93b3f8f481c7d24e37c5d9f30293c89e2933fa209421c8abd731dd3ef0715ecb"},
+]
+
[[package]]
name = "exceptiongroup"
version = "1.3.0"
@@ -138,6 +269,19 @@ files = [
{file = "iniconfig-2.1.0.tar.gz", hash = "sha256:3abbd2e30b36733fee78f9c7f7308f2d0050e88f0087fd25c2645f63c773e1c7"},
]
+[[package]]
+name = "jinja2"
+version = "3.1.6"
+requires_python = ">=3.7"
+summary = "A very fast and expressive template engine."
+dependencies = [
+ "MarkupSafe>=2.0",
+]
+files = [
+ {file = "jinja2-3.1.6-py3-none-any.whl", hash = "sha256:85ece4451f492d0c13c5dd7c13a64681a86afae63a5f347908daf103ce6d2f67"},
+ {file = "jinja2-3.1.6.tar.gz", hash = "sha256:0137fb05990d35f1275a587e9aee6d56da821fc83491a0fb838183be43f66d6d"},
+]
+
[[package]]
name = "lkml"
version = "1.3.7"
@@ -160,6 +304,65 @@ files = [
{file = "markdown_it_py-3.0.0-py3-none-any.whl", hash = "sha256:355216845c60bd96232cd8d8c40e8f9765cc86f46880e43a8fd22dc1a1a8cab1"},
]
+[[package]]
+name = "markupsafe"
+version = "3.0.2"
+requires_python = ">=3.9"
+summary = "Safely add untrusted strings to HTML/XML markup."
+files = [
+ {file = "MarkupSafe-3.0.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:7e94c425039cde14257288fd61dcfb01963e658efbc0ff54f5306b06054700f8"},
+ {file = "MarkupSafe-3.0.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:9e2d922824181480953426608b81967de705c3cef4d1af983af849d7bd619158"},
+ {file = "MarkupSafe-3.0.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:38a9ef736c01fccdd6600705b09dc574584b89bea478200c5fbf112a6b0d5579"},
+ {file = "MarkupSafe-3.0.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bbcb445fa71794da8f178f0f6d66789a28d7319071af7a496d4d507ed566270d"},
+ {file = "MarkupSafe-3.0.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:57cb5a3cf367aeb1d316576250f65edec5bb3be939e9247ae594b4bcbc317dfb"},
+ {file = "MarkupSafe-3.0.2-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:3809ede931876f5b2ec92eef964286840ed3540dadf803dd570c3b7e13141a3b"},
+ {file = "MarkupSafe-3.0.2-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:e07c3764494e3776c602c1e78e298937c3315ccc9043ead7e685b7f2b8d47b3c"},
+ {file = "MarkupSafe-3.0.2-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:b424c77b206d63d500bcb69fa55ed8d0e6a3774056bdc4839fc9298a7edca171"},
+ {file = "MarkupSafe-3.0.2-cp310-cp310-win32.whl", hash = "sha256:fcabf5ff6eea076f859677f5f0b6b5c1a51e70a376b0579e0eadef8db48c6b50"},
+ {file = "MarkupSafe-3.0.2-cp310-cp310-win_amd64.whl", hash = "sha256:6af100e168aa82a50e186c82875a5893c5597a0c1ccdb0d8b40240b1f28b969a"},
+ {file = "MarkupSafe-3.0.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:9025b4018f3a1314059769c7bf15441064b2207cb3f065e6ea1e7359cb46db9d"},
+ {file = "MarkupSafe-3.0.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:93335ca3812df2f366e80509ae119189886b0f3c2b81325d39efdb84a1e2ae93"},
+ {file = "MarkupSafe-3.0.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2cb8438c3cbb25e220c2ab33bb226559e7afb3baec11c4f218ffa7308603c832"},
+ {file = "MarkupSafe-3.0.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a123e330ef0853c6e822384873bef7507557d8e4a082961e1defa947aa59ba84"},
+ {file = "MarkupSafe-3.0.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1e084f686b92e5b83186b07e8a17fc09e38fff551f3602b249881fec658d3eca"},
+ {file = "MarkupSafe-3.0.2-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:d8213e09c917a951de9d09ecee036d5c7d36cb6cb7dbaece4c71a60d79fb9798"},
+ {file = "MarkupSafe-3.0.2-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:5b02fb34468b6aaa40dfc198d813a641e3a63b98c2b05a16b9f80b7ec314185e"},
+ {file = "MarkupSafe-3.0.2-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:0bff5e0ae4ef2e1ae4fdf2dfd5b76c75e5c2fa4132d05fc1b0dabcd20c7e28c4"},
+ {file = "MarkupSafe-3.0.2-cp311-cp311-win32.whl", hash = "sha256:6c89876f41da747c8d3677a2b540fb32ef5715f97b66eeb0c6b66f5e3ef6f59d"},
+ {file = "MarkupSafe-3.0.2-cp311-cp311-win_amd64.whl", hash = "sha256:70a87b411535ccad5ef2f1df5136506a10775d267e197e4cf531ced10537bd6b"},
+ {file = "MarkupSafe-3.0.2-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:9778bd8ab0a994ebf6f84c2b949e65736d5575320a17ae8984a77fab08db94cf"},
+ {file = "MarkupSafe-3.0.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:846ade7b71e3536c4e56b386c2a47adf5741d2d8b94ec9dc3e92e5e1ee1e2225"},
+ {file = "MarkupSafe-3.0.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1c99d261bd2d5f6b59325c92c73df481e05e57f19837bdca8413b9eac4bd8028"},
+ {file = "MarkupSafe-3.0.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e17c96c14e19278594aa4841ec148115f9c7615a47382ecb6b82bd8fea3ab0c8"},
+ {file = "MarkupSafe-3.0.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:88416bd1e65dcea10bc7569faacb2c20ce071dd1f87539ca2ab364bf6231393c"},
+ {file = "MarkupSafe-3.0.2-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:2181e67807fc2fa785d0592dc2d6206c019b9502410671cc905d132a92866557"},
+ {file = "MarkupSafe-3.0.2-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:52305740fe773d09cffb16f8ed0427942901f00adedac82ec8b67752f58a1b22"},
+ {file = "MarkupSafe-3.0.2-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:ad10d3ded218f1039f11a75f8091880239651b52e9bb592ca27de44eed242a48"},
+ {file = "MarkupSafe-3.0.2-cp312-cp312-win32.whl", hash = "sha256:0f4ca02bea9a23221c0182836703cbf8930c5e9454bacce27e767509fa286a30"},
+ {file = "MarkupSafe-3.0.2-cp312-cp312-win_amd64.whl", hash = "sha256:8e06879fc22a25ca47312fbe7c8264eb0b662f6db27cb2d3bbbc74b1df4b9b87"},
+ {file = "MarkupSafe-3.0.2-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:ba9527cdd4c926ed0760bc301f6728ef34d841f405abf9d4f959c478421e4efd"},
+ {file = "MarkupSafe-3.0.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:f8b3d067f2e40fe93e1ccdd6b2e1d16c43140e76f02fb1319a05cf2b79d99430"},
+ {file = "MarkupSafe-3.0.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:569511d3b58c8791ab4c2e1285575265991e6d8f8700c7be0e88f86cb0672094"},
+ {file = "MarkupSafe-3.0.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:15ab75ef81add55874e7ab7055e9c397312385bd9ced94920f2802310c930396"},
+ {file = "MarkupSafe-3.0.2-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f3818cb119498c0678015754eba762e0d61e5b52d34c8b13d770f0719f7b1d79"},
+ {file = "MarkupSafe-3.0.2-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:cdb82a876c47801bb54a690c5ae105a46b392ac6099881cdfb9f6e95e4014c6a"},
+ {file = "MarkupSafe-3.0.2-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:cabc348d87e913db6ab4aa100f01b08f481097838bdddf7c7a84b7575b7309ca"},
+ {file = "MarkupSafe-3.0.2-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:444dcda765c8a838eaae23112db52f1efaf750daddb2d9ca300bcae1039adc5c"},
+ {file = "MarkupSafe-3.0.2-cp313-cp313-win32.whl", hash = "sha256:bcf3e58998965654fdaff38e58584d8937aa3096ab5354d493c77d1fdd66d7a1"},
+ {file = "MarkupSafe-3.0.2-cp313-cp313-win_amd64.whl", hash = "sha256:e6a2a455bd412959b57a172ce6328d2dd1f01cb2135efda2e4576e8a23fa3b0f"},
+ {file = "MarkupSafe-3.0.2-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:b5a6b3ada725cea8a5e634536b1b01c30bcdcd7f9c6fff4151548d5bf6b3a36c"},
+ {file = "MarkupSafe-3.0.2-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:a904af0a6162c73e3edcb969eeeb53a63ceeb5d8cf642fade7d39e7963a22ddb"},
+ {file = "MarkupSafe-3.0.2-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4aa4e5faecf353ed117801a068ebab7b7e09ffb6e1d5e412dc852e0da018126c"},
+ {file = "MarkupSafe-3.0.2-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c0ef13eaeee5b615fb07c9a7dadb38eac06a0608b41570d8ade51c56539e509d"},
+ {file = "MarkupSafe-3.0.2-cp313-cp313t-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d16a81a06776313e817c951135cf7340a3e91e8c1ff2fac444cfd75fffa04afe"},
+ {file = "MarkupSafe-3.0.2-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:6381026f158fdb7c72a168278597a5e3a5222e83ea18f543112b2662a9b699c5"},
+ {file = "MarkupSafe-3.0.2-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:3d79d162e7be8f996986c064d1c7c817f6df3a77fe3d6859f6f9e7be4b8c213a"},
+ {file = "MarkupSafe-3.0.2-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:131a3c7689c85f5ad20f9f6fb1b866f402c445b220c19fe4308c0b147ccd2ad9"},
+ {file = "MarkupSafe-3.0.2-cp313-cp313t-win32.whl", hash = "sha256:ba8062ed2cf21c07a9e295d5b8a2a5ce678b913b45fdf68c32d95d6c1291e0b6"},
+ {file = "MarkupSafe-3.0.2-cp313-cp313t-win_amd64.whl", hash = "sha256:e444a31f8db13eb18ada366ab3cf45fd4b31e4db1236a4448f68778c1d1a5a2f"},
+ {file = "markupsafe-3.0.2.tar.gz", hash = "sha256:ee55d3edf80167e48ea11a923c7386f4669df67d7994554387f84e7d8b0a2bf0"},
+]
+
[[package]]
name = "mdurl"
version = "0.1.2"
@@ -170,6 +373,53 @@ files = [
{file = "mdurl-0.1.2.tar.gz", hash = "sha256:bb413d29f5eea38f31dd4754dd7377d4465116fb207585f97bf925588687c1ba"},
]
+[[package]]
+name = "mypy-extensions"
+version = "1.1.0"
+requires_python = ">=3.8"
+summary = "Type system extensions for programs checked with the mypy type checker."
+files = [
+ {file = "mypy_extensions-1.1.0-py3-none-any.whl", hash = "sha256:1be4cccdb0f2482337c4743e60421de3a356cd97508abadd57d47403e94f5505"},
+ {file = "mypy_extensions-1.1.0.tar.gz", hash = "sha256:52e68efc3284861e772bbcd66823fde5ae21fd2fdb51c62a211403730b916558"},
+]
+
+[[package]]
+name = "nr-date"
+version = "2.1.0"
+requires_python = ">=3.6,<4.0"
+summary = ""
+dependencies = [
+ "dataclasses<0.9,>=0.8; python_version == \"3.6\"",
+]
+files = [
+ {file = "nr_date-2.1.0-py3-none-any.whl", hash = "sha256:bd672a9dfbdcf7c4b9289fea6750c42490eaee08036a72059dcc78cb236ed568"},
+ {file = "nr_date-2.1.0.tar.gz", hash = "sha256:0643aea13bcdc2a8bc56af9d5e6a89ef244c9744a1ef00cdc735902ba7f7d2e6"},
+]
+
+[[package]]
+name = "nr-stream"
+version = "1.1.5"
+requires_python = ">=3.6,<4.0"
+summary = ""
+files = [
+ {file = "nr_stream-1.1.5-py3-none-any.whl", hash = "sha256:47e12150b331ad2cb729cfd9d2abd281c9949809729ba461c6aa87dd9927b2d4"},
+ {file = "nr_stream-1.1.5.tar.gz", hash = "sha256:eb0216c6bfc61a46d4568dba3b588502c610ec8ddef4ac98f3932a2bd7264f65"},
+]
+
+[[package]]
+name = "nr-util"
+version = "0.8.12"
+requires_python = ">=3.7,<4.0"
+summary = "General purpose Python utility library."
+dependencies = [
+ "deprecated<2.0.0,>=1.2.0",
+ "typing-extensions>=3.0.0",
+]
+files = [
+ {file = "nr.util-0.8.12-py3-none-any.whl", hash = "sha256:91da02ac9795eb8e015372275c1efe54bac9051231ee9b0e7e6f96b0b4e7d2bb"},
+ {file = "nr.util-0.8.12.tar.gz", hash = "sha256:a4549c2033d99d2f0379b3f3d233fd2a8ade286bbf0b3ad0cc7cea16022214f4"},
+]
+
[[package]]
name = "packaging"
version = "25.0"
@@ -180,6 +430,26 @@ files = [
{file = "packaging-25.0.tar.gz", hash = "sha256:d443872c98d677bf60f6a1f2f8c1cb748e8fe762d2bf9d3148b5599295b0fc4f"},
]
+[[package]]
+name = "pathspec"
+version = "0.12.1"
+requires_python = ">=3.8"
+summary = "Utility library for gitignore style pattern matching of file paths."
+files = [
+ {file = "pathspec-0.12.1-py3-none-any.whl", hash = "sha256:a0d503e138a4c123b27490a4f7beda6a01c6f288df0e4a8b79c7eb0dc7b4cc08"},
+ {file = "pathspec-0.12.1.tar.gz", hash = "sha256:a482d51503a1ab33b1c67a6c3813a26953dbdc71c31dacaef9a838c4e29f5712"},
+]
+
+[[package]]
+name = "platformdirs"
+version = "4.3.8"
+requires_python = ">=3.9"
+summary = "A small Python package for determining appropriate platform-specific dirs, e.g. a `user data dir`."
+files = [
+ {file = "platformdirs-4.3.8-py3-none-any.whl", hash = "sha256:ff7059bb7eb1179e2685604f4aaf157cfd9535242bd23742eadc3c13542139b4"},
+ {file = "platformdirs-4.3.8.tar.gz", hash = "sha256:3d512d96e16bcb959a814c9f348431070822a6496326a4be0911c40b5a74c2bc"},
+]
+
[[package]]
name = "pluggy"
version = "1.6.0"
@@ -190,6 +460,32 @@ files = [
{file = "pluggy-1.6.0.tar.gz", hash = "sha256:7dcc130b76258d33b90f61b658791dede3486c3e6bfb003ee5c9bfb396dd22f3"},
]
+[[package]]
+name = "pydoc-markdown"
+version = "4.8.2"
+requires_python = ">=3.7,<4.0"
+summary = "Create Python API documentation in Markdown format."
+dependencies = [
+ "PyYAML<7.0,>=5.0",
+ "click<9.0,>=7.1",
+ "databind-core<5.0.0,>=4.4.0",
+ "databind-json<5.0.0,>=4.4.0",
+ "docspec-python<3.0.0,>=2.2.1",
+ "docspec<3.0.0,>=2.2.1",
+ "docstring-parser<0.12,>=0.11",
+ "jinja2<4.0.0,>=3.0.0",
+ "nr-util<1.0.0,>=0.7.5",
+ "requests<3.0.0,>=2.23.0",
+ "tomli-w<2.0.0,>=1.0.0",
+ "tomli<3.0.0,>=2.0.0",
+ "watchdog",
+ "yapf>=0.30.0",
+]
+files = [
+ {file = "pydoc_markdown-4.8.2-py3-none-any.whl", hash = "sha256:203f74119e6bb2f9deba43d452422de7c8ec31955b61e0620fa4dd8c2611715f"},
+ {file = "pydoc_markdown-4.8.2.tar.gz", hash = "sha256:fb6c927e31386de17472d42f9bd3d3be2905977d026f6216881c65145aa67f0b"},
+]
+
[[package]]
name = "pygments"
version = "2.19.2"
@@ -345,6 +641,29 @@ files = [
{file = "tomli-2.2.1.tar.gz", hash = "sha256:cd45e1dc79c835ce60f7404ec8119f2eb06d38b1deba146f07ced3bbc44505ff"},
]
+[[package]]
+name = "tomli-w"
+version = "1.2.0"
+requires_python = ">=3.9"
+summary = "A lil' TOML writer"
+files = [
+ {file = "tomli_w-1.2.0-py3-none-any.whl", hash = "sha256:188306098d013b691fcadc011abd66727d3c414c571bb01b1a174ba8c983cf90"},
+ {file = "tomli_w-1.2.0.tar.gz", hash = "sha256:2dd14fac5a47c27be9cd4c976af5a12d87fb1f0b4512f81d69cce3b35ae25021"},
+]
+
+[[package]]
+name = "typeapi"
+version = "2.2.4"
+requires_python = ">=3.8"
+summary = ""
+dependencies = [
+ "typing-extensions>=3.0.0",
+]
+files = [
+ {file = "typeapi-2.2.4-py3-none-any.whl", hash = "sha256:bd6d5e5907fa47e0303bf254e7cc8712d4be4eb26d7ffaedb67c9e7844c53bb8"},
+ {file = "typeapi-2.2.4.tar.gz", hash = "sha256:daa80767520c0957a320577e4f729c0ba6921c708def31f4c6fd8d611908fd7b"},
+]
+
[[package]]
name = "typer"
version = "0.16.0"
@@ -394,3 +713,115 @@ files = [
{file = "urllib3-2.5.0-py3-none-any.whl", hash = "sha256:e6b01673c0fa6a13e374b50871808eb3bf7046c4b125b216f6bf1cc604cff0dc"},
{file = "urllib3-2.5.0.tar.gz", hash = "sha256:3fc47733c7e419d4bc3f6b3dc2b4f890bb743906a30d56ba4a5bfa4bbff92760"},
]
+
+[[package]]
+name = "watchdog"
+version = "6.0.0"
+requires_python = ">=3.9"
+summary = "Filesystem events monitoring"
+files = [
+ {file = "watchdog-6.0.0-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:d1cdb490583ebd691c012b3d6dae011000fe42edb7a82ece80965b42abd61f26"},
+ {file = "watchdog-6.0.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:bc64ab3bdb6a04d69d4023b29422170b74681784ffb9463ed4870cf2f3e66112"},
+ {file = "watchdog-6.0.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:c897ac1b55c5a1461e16dae288d22bb2e412ba9807df8397a635d88f671d36c3"},
+ {file = "watchdog-6.0.0-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:6eb11feb5a0d452ee41f824e271ca311a09e250441c262ca2fd7ebcf2461a06c"},
+ {file = "watchdog-6.0.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:ef810fbf7b781a5a593894e4f439773830bdecb885e6880d957d5b9382a960d2"},
+ {file = "watchdog-6.0.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:afd0fe1b2270917c5e23c2a65ce50c2a4abb63daafb0d419fde368e272a76b7c"},
+ {file = "watchdog-6.0.0-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:bdd4e6f14b8b18c334febb9c4425a878a2ac20efd1e0b231978e7b150f92a948"},
+ {file = "watchdog-6.0.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:c7c15dda13c4eb00d6fb6fc508b3c0ed88b9d5d374056b239c4ad1611125c860"},
+ {file = "watchdog-6.0.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:6f10cb2d5902447c7d0da897e2c6768bca89174d0c6e1e30abec5421af97a5b0"},
+ {file = "watchdog-6.0.0-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:490ab2ef84f11129844c23fb14ecf30ef3d8a6abafd3754a6f75ca1e6654136c"},
+ {file = "watchdog-6.0.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:76aae96b00ae814b181bb25b1b98076d5fc84e8a53cd8885a318b42b6d3a5134"},
+ {file = "watchdog-6.0.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:a175f755fc2279e0b7312c0035d52e27211a5bc39719dd529625b1930917345b"},
+ {file = "watchdog-6.0.0-pp310-pypy310_pp73-macosx_10_15_x86_64.whl", hash = "sha256:c7ac31a19f4545dd92fc25d200694098f42c9a8e391bc00bdd362c5736dbf881"},
+ {file = "watchdog-6.0.0-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:9513f27a1a582d9808cf21a07dae516f0fab1cf2d7683a742c498b93eedabb11"},
+ {file = "watchdog-6.0.0-py3-none-manylinux2014_aarch64.whl", hash = "sha256:7607498efa04a3542ae3e05e64da8202e58159aa1fa4acddf7678d34a35d4f13"},
+ {file = "watchdog-6.0.0-py3-none-manylinux2014_armv7l.whl", hash = "sha256:9041567ee8953024c83343288ccc458fd0a2d811d6a0fd68c4c22609e3490379"},
+ {file = "watchdog-6.0.0-py3-none-manylinux2014_i686.whl", hash = "sha256:82dc3e3143c7e38ec49d61af98d6558288c415eac98486a5c581726e0737c00e"},
+ {file = "watchdog-6.0.0-py3-none-manylinux2014_ppc64.whl", hash = "sha256:212ac9b8bf1161dc91bd09c048048a95ca3a4c4f5e5d4a7d1b1a7d5752a7f96f"},
+ {file = "watchdog-6.0.0-py3-none-manylinux2014_ppc64le.whl", hash = "sha256:e3df4cbb9a450c6d49318f6d14f4bbc80d763fa587ba46ec86f99f9e6876bb26"},
+ {file = "watchdog-6.0.0-py3-none-manylinux2014_s390x.whl", hash = "sha256:2cce7cfc2008eb51feb6aab51251fd79b85d9894e98ba847408f662b3395ca3c"},
+ {file = "watchdog-6.0.0-py3-none-manylinux2014_x86_64.whl", hash = "sha256:20ffe5b202af80ab4266dcd3e91aae72bf2da48c0d33bdb15c66658e685e94e2"},
+ {file = "watchdog-6.0.0-py3-none-win32.whl", hash = "sha256:07df1fdd701c5d4c8e55ef6cf55b8f0120fe1aef7ef39a1c6fc6bc2e606d517a"},
+ {file = "watchdog-6.0.0-py3-none-win_amd64.whl", hash = "sha256:cbafb470cf848d93b5d013e2ecb245d4aa1c8fd0504e863ccefa32445359d680"},
+ {file = "watchdog-6.0.0-py3-none-win_ia64.whl", hash = "sha256:a1914259fa9e1454315171103c6a30961236f508b9b623eae470268bbcc6a22f"},
+ {file = "watchdog-6.0.0.tar.gz", hash = "sha256:9ddf7c82fda3ae8e24decda1338ede66e1c99883db93711d8fb941eaa2d8c282"},
+]
+
+[[package]]
+name = "wrapt"
+version = "1.17.2"
+requires_python = ">=3.8"
+summary = "Module for decorators, wrappers and monkey patching."
+files = [
+ {file = "wrapt-1.17.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:3d57c572081fed831ad2d26fd430d565b76aa277ed1d30ff4d40670b1c0dd984"},
+ {file = "wrapt-1.17.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:b5e251054542ae57ac7f3fba5d10bfff615b6c2fb09abeb37d2f1463f841ae22"},
+ {file = "wrapt-1.17.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:80dd7db6a7cb57ffbc279c4394246414ec99537ae81ffd702443335a61dbf3a7"},
+ {file = "wrapt-1.17.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0a6e821770cf99cc586d33833b2ff32faebdbe886bd6322395606cf55153246c"},
+ {file = "wrapt-1.17.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:b60fb58b90c6d63779cb0c0c54eeb38941bae3ecf7a73c764c52c88c2dcb9d72"},
+ {file = "wrapt-1.17.2-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b870b5df5b71d8c3359d21be8f0d6c485fa0ebdb6477dda51a1ea54a9b558061"},
+ {file = "wrapt-1.17.2-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:4011d137b9955791f9084749cba9a367c68d50ab8d11d64c50ba1688c9b457f2"},
+ {file = "wrapt-1.17.2-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:1473400e5b2733e58b396a04eb7f35f541e1fb976d0c0724d0223dd607e0f74c"},
+ {file = "wrapt-1.17.2-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:3cedbfa9c940fdad3e6e941db7138e26ce8aad38ab5fe9dcfadfed9db7a54e62"},
+ {file = "wrapt-1.17.2-cp310-cp310-win32.whl", hash = "sha256:582530701bff1dec6779efa00c516496968edd851fba224fbd86e46cc6b73563"},
+ {file = "wrapt-1.17.2-cp310-cp310-win_amd64.whl", hash = "sha256:58705da316756681ad3c9c73fd15499aa4d8c69f9fd38dc8a35e06c12468582f"},
+ {file = "wrapt-1.17.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:ff04ef6eec3eee8a5efef2401495967a916feaa353643defcc03fc74fe213b58"},
+ {file = "wrapt-1.17.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:4db983e7bca53819efdbd64590ee96c9213894272c776966ca6306b73e4affda"},
+ {file = "wrapt-1.17.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:9abc77a4ce4c6f2a3168ff34b1da9b0f311a8f1cfd694ec96b0603dff1c79438"},
+ {file = "wrapt-1.17.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0b929ac182f5ace000d459c59c2c9c33047e20e935f8e39371fa6e3b85d56f4a"},
+ {file = "wrapt-1.17.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f09b286faeff3c750a879d336fb6d8713206fc97af3adc14def0cdd349df6000"},
+ {file = "wrapt-1.17.2-cp311-cp311-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1a7ed2d9d039bd41e889f6fb9364554052ca21ce823580f6a07c4ec245c1f5d6"},
+ {file = "wrapt-1.17.2-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:129a150f5c445165ff941fc02ee27df65940fcb8a22a61828b1853c98763a64b"},
+ {file = "wrapt-1.17.2-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:1fb5699e4464afe5c7e65fa51d4f99e0b2eadcc176e4aa33600a3df7801d6662"},
+ {file = "wrapt-1.17.2-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:9a2bce789a5ea90e51a02dfcc39e31b7f1e662bc3317979aa7e5538e3a034f72"},
+ {file = "wrapt-1.17.2-cp311-cp311-win32.whl", hash = "sha256:4afd5814270fdf6380616b321fd31435a462019d834f83c8611a0ce7484c7317"},
+ {file = "wrapt-1.17.2-cp311-cp311-win_amd64.whl", hash = "sha256:acc130bc0375999da18e3d19e5a86403667ac0c4042a094fefb7eec8ebac7cf3"},
+ {file = "wrapt-1.17.2-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:d5e2439eecc762cd85e7bd37161d4714aa03a33c5ba884e26c81559817ca0925"},
+ {file = "wrapt-1.17.2-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:3fc7cb4c1c744f8c05cd5f9438a3caa6ab94ce8344e952d7c45a8ed59dd88392"},
+ {file = "wrapt-1.17.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:8fdbdb757d5390f7c675e558fd3186d590973244fab0c5fe63d373ade3e99d40"},
+ {file = "wrapt-1.17.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5bb1d0dbf99411f3d871deb6faa9aabb9d4e744d67dcaaa05399af89d847a91d"},
+ {file = "wrapt-1.17.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d18a4865f46b8579d44e4fe1e2bcbc6472ad83d98e22a26c963d46e4c125ef0b"},
+ {file = "wrapt-1.17.2-cp312-cp312-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bc570b5f14a79734437cb7b0500376b6b791153314986074486e0b0fa8d71d98"},
+ {file = "wrapt-1.17.2-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:6d9187b01bebc3875bac9b087948a2bccefe464a7d8f627cf6e48b1bbae30f82"},
+ {file = "wrapt-1.17.2-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:9e8659775f1adf02eb1e6f109751268e493c73716ca5761f8acb695e52a756ae"},
+ {file = "wrapt-1.17.2-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:e8b2816ebef96d83657b56306152a93909a83f23994f4b30ad4573b00bd11bb9"},
+ {file = "wrapt-1.17.2-cp312-cp312-win32.whl", hash = "sha256:468090021f391fe0056ad3e807e3d9034e0fd01adcd3bdfba977b6fdf4213ea9"},
+ {file = "wrapt-1.17.2-cp312-cp312-win_amd64.whl", hash = "sha256:ec89ed91f2fa8e3f52ae53cd3cf640d6feff92ba90d62236a81e4e563ac0e991"},
+ {file = "wrapt-1.17.2-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:6ed6ffac43aecfe6d86ec5b74b06a5be33d5bb9243d055141e8cabb12aa08125"},
+ {file = "wrapt-1.17.2-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:35621ae4c00e056adb0009f8e86e28eb4a41a4bfa8f9bfa9fca7d343fe94f998"},
+ {file = "wrapt-1.17.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:a604bf7a053f8362d27eb9fefd2097f82600b856d5abe996d623babd067b1ab5"},
+ {file = "wrapt-1.17.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5cbabee4f083b6b4cd282f5b817a867cf0b1028c54d445b7ec7cfe6505057cf8"},
+ {file = "wrapt-1.17.2-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:49703ce2ddc220df165bd2962f8e03b84c89fee2d65e1c24a7defff6f988f4d6"},
+ {file = "wrapt-1.17.2-cp313-cp313-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8112e52c5822fc4253f3901b676c55ddf288614dc7011634e2719718eaa187dc"},
+ {file = "wrapt-1.17.2-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:9fee687dce376205d9a494e9c121e27183b2a3df18037f89d69bd7b35bcf59e2"},
+ {file = "wrapt-1.17.2-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:18983c537e04d11cf027fbb60a1e8dfd5190e2b60cc27bc0808e653e7b218d1b"},
+ {file = "wrapt-1.17.2-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:703919b1633412ab54bcf920ab388735832fdcb9f9a00ae49387f0fe67dad504"},
+ {file = "wrapt-1.17.2-cp313-cp313-win32.whl", hash = "sha256:abbb9e76177c35d4e8568e58650aa6926040d6a9f6f03435b7a522bf1c487f9a"},
+ {file = "wrapt-1.17.2-cp313-cp313-win_amd64.whl", hash = "sha256:69606d7bb691b50a4240ce6b22ebb319c1cfb164e5f6569835058196e0f3a845"},
+ {file = "wrapt-1.17.2-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:4a721d3c943dae44f8e243b380cb645a709ba5bd35d3ad27bc2ed947e9c68192"},
+ {file = "wrapt-1.17.2-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:766d8bbefcb9e00c3ac3b000d9acc51f1b399513f44d77dfe0eb026ad7c9a19b"},
+ {file = "wrapt-1.17.2-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:e496a8ce2c256da1eb98bd15803a79bee00fc351f5dfb9ea82594a3f058309e0"},
+ {file = "wrapt-1.17.2-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:40d615e4fe22f4ad3528448c193b218e077656ca9ccb22ce2cb20db730f8d306"},
+ {file = "wrapt-1.17.2-cp313-cp313t-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a5aaeff38654462bc4b09023918b7f21790efb807f54c000a39d41d69cf552cb"},
+ {file = "wrapt-1.17.2-cp313-cp313t-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9a7d15bbd2bc99e92e39f49a04653062ee6085c0e18b3b7512a4f2fe91f2d681"},
+ {file = "wrapt-1.17.2-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:e3890b508a23299083e065f435a492b5435eba6e304a7114d2f919d400888cc6"},
+ {file = "wrapt-1.17.2-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:8c8b293cd65ad716d13d8dd3624e42e5a19cc2a2f1acc74b30c2c13f15cb61a6"},
+ {file = "wrapt-1.17.2-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:4c82b8785d98cdd9fed4cac84d765d234ed3251bd6afe34cb7ac523cb93e8b4f"},
+ {file = "wrapt-1.17.2-cp313-cp313t-win32.whl", hash = "sha256:13e6afb7fe71fe7485a4550a8844cc9ffbe263c0f1a1eea569bc7091d4898555"},
+ {file = "wrapt-1.17.2-cp313-cp313t-win_amd64.whl", hash = "sha256:eaf675418ed6b3b31c7a989fd007fa7c3be66ce14e5c3b27336383604c9da85c"},
+ {file = "wrapt-1.17.2-py3-none-any.whl", hash = "sha256:b18f2d1533a71f069c7f82d524a52599053d4c7166e9dd374ae2136b7f40f7c8"},
+ {file = "wrapt-1.17.2.tar.gz", hash = "sha256:41388e9d4d1522446fe79d3213196bd9e3b301a336965b9e27ca2788ebd122f3"},
+]
+
+[[package]]
+name = "yapf"
+version = "0.43.0"
+requires_python = ">=3.7"
+summary = "A formatter for Python code"
+dependencies = [
+ "platformdirs>=3.5.1",
+ "tomli>=2.0.1; python_version < \"3.11\"",
+]
+files = [
+ {file = "yapf-0.43.0-py3-none-any.whl", hash = "sha256:224faffbc39c428cb095818cf6ef5511fdab6f7430a10783fdfb292ccf2852ca"},
+ {file = "yapf-0.43.0.tar.gz", hash = "sha256:00d3aa24bfedff9420b2e0d5d9f5ab6d9d4268e72afbf59bb3fa542781d5218e"},
+]
diff --git a/pyproject.toml b/pyproject.toml
index 89d0839..4fd9d6f 100644
--- a/pyproject.toml
+++ b/pyproject.toml
@@ -12,6 +12,7 @@ dependencies = [
"pyyaml>=6.0.1",
"rich>=13.7.1",
"requests>=2.32.4",
+ "pydoc-markdown>=4.8.2",
]
requires-python = ">=3.10"
readme = "README.md"
diff --git a/requirements.txt b/requirements.txt
index 805ed63..7291302 100644
--- a/requirements.txt
+++ b/requirements.txt
@@ -3,3 +3,4 @@ lkml>=1.3.1
pyyaml>=6.0.1
rich>=13.7.1
requests>=2.32.4
+pydoc-markdown>=4.8.2
diff --git a/scripts/generate_docs.py b/scripts/generate_docs.py
new file mode 100644
index 0000000..e49b98d
--- /dev/null
+++ b/scripts/generate_docs.py
@@ -0,0 +1,105 @@
+# scripts/generate_docs.py
+import os
+import subprocess
+import yaml
+import sys
+
+
+def generate_module_docs():
+ """Generate Markdown documentation for all Python modules."""
+
+ # Ensure docs directory exists
+ os.makedirs("docs", exist_ok=True)
+
+ # Add current directory to Python path
+ current_dir = os.path.abspath(".")
+ if current_dir not in sys.path:
+ sys.path.insert(0, current_dir)
+
+    # Generate docs for each module using pydoc-markdown
+ modules = [
+ "lkml2cube.parser.cube_api",
+ "lkml2cube.parser.explores",
+ "lkml2cube.parser.loader",
+ "lkml2cube.parser.types",
+ "lkml2cube.parser.views",
+ "lkml2cube.converter",
+ "lkml2cube.main",
+ ]
+
+ for module in modules:
+ output_file = f"docs/{module.replace('.', '_')}.md"
+ print(f"Generating documentation for {module}...")
+
+ # First try pydoc-markdown
+ if try_pydoc_markdown(module, output_file):
+ print(f" ✓ Created {output_file} using pydoc-markdown")
+ else:
+ print(f" ✗ Error generating docs for {module}")
+
+
+def try_pydoc_markdown(module, output_file):
+ """Try to generate documentation using pydoc-markdown."""
+ try:
+ # Create a temporary config file for this module
+ config = {
+ "loaders": [{"type": "python", "search_path": ["."], "modules": [module]}],
+ "renderer": {
+ "type": "markdown",
+ "render_toc": True,
+ "render_module_header": True,
+ "filename": output_file,
+ },
+ }
+
+ config_file = f".pydoc-markdown-{module.replace('.', '_')}.yml"
+ with open(config_file, "w") as f:
+ yaml.dump(config, f)
+
+ # Run pydoc-markdown with the config file
+ env = os.environ.copy()
+        env["PYTHONPATH"] = os.path.abspath(".") + os.pathsep + env.get("PYTHONPATH", "")
+
+ result = subprocess.run(
+ ["pydoc-markdown", config_file], capture_output=True, text=True, env=env
+ )
+
+ # Clean up temporary config file
+ if os.path.exists(config_file):
+ os.remove(config_file)
+
+ if result.returncode == 0 and os.path.exists(output_file):
+ optimize_for_llm(output_file)
+ return True
+ else:
+ return False
+ except Exception:
+ return False
+
+
+def optimize_for_llm(filepath):
+    """Trim filler phrases so generated docs are more concise for LLM consumption."""
+ with open(filepath, "r") as f:
+ content = f.read()
+
+ # Remove redundant phrases
+ optimizations = [
+ ("This function ", ""),
+ ("This method ", ""),
+ ("is used to ", ""),
+ ("is responsible for ", ""),
+ ("The purpose of this ", ""),
+ ]
+
+ for old, new in optimizations:
+ content = content.replace(old, new)
+
+ with open(filepath, "w") as f:
+ f.write(content)
+
+
+if __name__ == "__main__":
+ generate_module_docs()
diff --git a/tests/test_converter.py b/tests/test_converter.py
new file mode 100644
index 0000000..df739d5
--- /dev/null
+++ b/tests/test_converter.py
@@ -0,0 +1,615 @@
+"""
+Unit tests for the LookMLConverter class.
+
+This module contains comprehensive tests for the LookMLConverter class,
+including initialization, configuration management, and all core conversion methods.
+"""
+
+import pytest
+import yaml
+import tempfile
+from pathlib import Path
+from unittest.mock import patch, Mock, mock_open
+from os.path import join, dirname
+
+from lkml2cube.converter import LookMLConverter
+
+
+class TestLookMLConverterInitialization:
+ """Test LookMLConverter initialization and configuration."""
+
+ def test_default_initialization(self):
+ """Test LookMLConverter with default parameters."""
+ converter = LookMLConverter()
+
+ assert converter.outputdir == "."
+ assert converter.rootdir is None
+ assert converter.parseonly is False
+ assert converter.printonly is False
+ assert converter.use_explores_name is False
+
+ def test_custom_initialization(self):
+ """Test LookMLConverter with custom parameters."""
+ converter = LookMLConverter(
+ outputdir="/tmp/output",
+ rootdir="/tmp/root",
+ parseonly=True,
+ printonly=True,
+ use_explores_name=True
+ )
+
+ assert converter.outputdir == "/tmp/output"
+ assert converter.rootdir == "/tmp/root"
+ assert converter.parseonly is True
+ assert converter.printonly is True
+ assert converter.use_explores_name is True
+
+ def test_string_representation(self):
+ """Test __repr__ method."""
+ converter = LookMLConverter(outputdir="/tmp")
+ repr_str = repr(converter)
+
+ assert "LookMLConverter" in repr_str
+ assert "outputdir='/tmp'" in repr_str
+ assert "rootdir='None'" in repr_str
+ assert "parseonly=False" in repr_str
+
+
+class TestLookMLConverterConfiguration:
+ """Test configuration management methods."""
+
+ def test_get_config(self):
+ """Test get_config method."""
+ converter = LookMLConverter(outputdir="/tmp", parseonly=True)
+ config = converter.get_config()
+
+ expected_config = {
+ 'outputdir': '/tmp',
+ 'rootdir': None,
+ 'parseonly': True,
+ 'printonly': False,
+ 'use_explores_name': False
+ }
+
+ assert config == expected_config
+
+ def test_set_config_partial(self):
+ """Test set_config with partial updates."""
+ converter = LookMLConverter()
+
+ converter.set_config(outputdir="/new/path", parseonly=True)
+
+ assert converter.outputdir == "/new/path"
+ assert converter.parseonly is True
+ assert converter.printonly is False # Should remain unchanged
+
+ def test_set_config_all_params(self):
+ """Test set_config with all parameters."""
+ converter = LookMLConverter()
+
+ converter.set_config(
+ outputdir="/new/path",
+ rootdir="/new/root",
+ parseonly=True,
+ printonly=True,
+ use_explores_name=True
+ )
+
+ assert converter.outputdir == "/new/path"
+ assert converter.rootdir == "/new/root"
+ assert converter.parseonly is True
+ assert converter.printonly is True
+ assert converter.use_explores_name is True
+
+ def test_set_config_none_values(self):
+ """Test set_config with None values (should not change)."""
+ converter = LookMLConverter(outputdir="/original", parseonly=True)
+
+ converter.set_config(outputdir=None, parseonly=None)
+
+ assert converter.outputdir == "/original"
+ assert converter.parseonly is True
+
+
+class TestLookMLConverterCubes:
+ """Test the cubes() method."""
+
+ @patch('lkml2cube.converter.file_loader')
+ def test_cubes_file_not_found(self, mock_file_loader):
+ """Test cubes method when file is not found."""
+ mock_file_loader.return_value = None
+ converter = LookMLConverter()
+
+ with pytest.raises(ValueError, match="No files were found on path"):
+ converter.cubes("nonexistent.lkml")
+
+ @patch('lkml2cube.converter.file_loader')
+ def test_cubes_parseonly(self, mock_file_loader):
+ """Test cubes method with parseonly=True."""
+ sample_lookml = {
+ 'views': [{'name': 'orders', 'sql_table_name': 'orders'}]
+ }
+ mock_file_loader.return_value = sample_lookml
+
+ converter = LookMLConverter(parseonly=True)
+ result = converter.cubes("test.lkml")
+
+ assert 'lookml_model' in result
+ assert 'parsed_model' in result
+ assert result['lookml_model'] == sample_lookml
+ assert isinstance(result['parsed_model'], str)
+
+ @patch('lkml2cube.converter.console')
+ @patch('lkml2cube.converter.yaml.dump')
+ @patch('lkml2cube.converter.generate_cube_joins')
+ @patch('lkml2cube.converter.parse_view')
+ @patch('lkml2cube.converter.file_loader')
+ def test_cubes_printonly(self, mock_file_loader, mock_parse_view,
+ mock_generate_joins, mock_yaml_dump, mock_console):
+ """Test cubes method with printonly=True."""
+ sample_lookml = {'views': [{'name': 'orders'}]}
+ sample_cube_def = {'cubes': [{'name': 'orders'}]}
+ yaml_output = "cubes:\n- name: orders\n"
+
+ mock_file_loader.return_value = sample_lookml
+ mock_parse_view.return_value = sample_cube_def
+ mock_generate_joins.return_value = sample_cube_def
+ mock_yaml_dump.return_value = yaml_output
+
+ converter = LookMLConverter(printonly=True)
+ result = converter.cubes("test.lkml")
+
+ assert 'lookml_model' in result
+ assert 'cube_def' in result
+ assert 'yaml_output' in result
+ assert result['yaml_output'] == yaml_output
+ mock_console.print.assert_called_once_with(yaml_output)
+
+ @patch('lkml2cube.converter.print_summary')
+ @patch('lkml2cube.converter.write_files')
+ @patch('lkml2cube.converter.generate_cube_joins')
+ @patch('lkml2cube.converter.parse_view')
+ @patch('lkml2cube.converter.file_loader')
+ def test_cubes_write_files(self, mock_file_loader, mock_parse_view,
+ mock_generate_joins, mock_write_files, mock_print_summary):
+ """Test cubes method with file writing."""
+ sample_lookml = {'views': [{'name': 'orders'}]}
+ sample_cube_def = {'cubes': [{'name': 'orders'}]}
+ sample_summary = {'cubes': [{'name': 'orders', 'path': '/tmp/orders.yml'}]}
+
+ mock_file_loader.return_value = sample_lookml
+ mock_parse_view.return_value = sample_cube_def
+ mock_generate_joins.return_value = sample_cube_def
+ mock_write_files.return_value = sample_summary
+
+ converter = LookMLConverter(outputdir="/tmp")
+ result = converter.cubes("test.lkml")
+
+ assert 'lookml_model' in result
+ assert 'cube_def' in result
+ assert 'summary' in result
+ assert result['summary'] == sample_summary
+
+ mock_write_files.assert_called_once_with(sample_cube_def, outputdir="/tmp")
+ mock_print_summary.assert_called_once_with(sample_summary)
+
+
+class TestLookMLConverterViews:
+ """Test the views() method."""
+
+ @patch('lkml2cube.converter.file_loader')
+ def test_views_file_not_found(self, mock_file_loader):
+ """Test views method when file is not found."""
+ mock_file_loader.return_value = None
+ converter = LookMLConverter()
+
+ with pytest.raises(ValueError, match="No files were found on path"):
+ converter.views("nonexistent.lkml")
+
+ @patch('lkml2cube.converter.file_loader')
+ def test_views_parseonly(self, mock_file_loader):
+ """Test views method with parseonly=True."""
+ sample_lookml = {
+ 'views': [{'name': 'orders'}],
+ 'explores': [{'name': 'orders_explore'}]
+ }
+ mock_file_loader.return_value = sample_lookml
+
+ converter = LookMLConverter(parseonly=True)
+ result = converter.views("test.lkml")
+
+ assert 'lookml_model' in result
+ assert 'parsed_model' in result
+ assert result['lookml_model'] == sample_lookml
+
+ @patch('lkml2cube.converter.console')
+ @patch('lkml2cube.converter.yaml.dump')
+ @patch('lkml2cube.converter.parse_explores')
+ @patch('lkml2cube.converter.file_loader')
+ def test_views_printonly(self, mock_file_loader, mock_parse_explores,
+ mock_yaml_dump, mock_console):
+ """Test views method with printonly=True."""
+ sample_lookml = {'views': [{'name': 'orders'}], 'explores': []}
+ sample_cube_def = {'cubes': [{'name': 'orders'}], 'views': []}
+ yaml_output = "cubes:\n- name: orders\nviews: []\n"
+
+ mock_file_loader.return_value = sample_lookml
+ mock_parse_explores.return_value = sample_cube_def
+ mock_yaml_dump.return_value = yaml_output
+
+ converter = LookMLConverter(printonly=True, use_explores_name=True)
+ result = converter.views("test.lkml")
+
+ assert 'yaml_output' in result
+ assert result['yaml_output'] == yaml_output
+ mock_parse_explores.assert_called_once_with(sample_lookml, True)
+ mock_console.print.assert_called_once_with(yaml_output)
+
+ @patch('lkml2cube.converter.print_summary')
+ @patch('lkml2cube.converter.write_files')
+ @patch('lkml2cube.converter.parse_explores')
+ @patch('lkml2cube.converter.file_loader')
+ def test_views_write_files(self, mock_file_loader, mock_parse_explores,
+ mock_write_files, mock_print_summary):
+ """Test views method with file writing."""
+ sample_lookml = {'views': [{'name': 'orders'}], 'explores': []}
+ sample_cube_def = {'cubes': [{'name': 'orders'}], 'views': []}
+ sample_summary = {'cubes': [{'name': 'orders', 'path': '/tmp/orders.yml'}], 'views': []}
+
+ mock_file_loader.return_value = sample_lookml
+ mock_parse_explores.return_value = sample_cube_def
+ mock_write_files.return_value = sample_summary
+
+ converter = LookMLConverter(outputdir="/tmp", use_explores_name=False)
+ result = converter.views("test.lkml")
+
+ assert 'summary' in result
+ assert result['summary'] == sample_summary
+
+ mock_parse_explores.assert_called_once_with(sample_lookml, False)
+ mock_write_files.assert_called_once_with(sample_cube_def, outputdir="/tmp")
+
+
+class TestLookMLConverterExplores:
+ """Test the explores() method."""
+
+ @patch('lkml2cube.converter.meta_loader')
+ def test_explores_no_response(self, mock_meta_loader):
+ """Test explores method when no response is received."""
+ mock_meta_loader.return_value = None
+ converter = LookMLConverter()
+
+ with pytest.raises(ValueError, match="No response received from"):
+ converter.explores("http://test.com/meta", "token")
+
+ @patch('lkml2cube.converter.meta_loader')
+ def test_explores_parseonly(self, mock_meta_loader):
+ """Test explores method with parseonly=True."""
+ sample_cube_model = {
+ 'cubes': [{'name': 'orders', 'sql_table': 'orders'}]
+ }
+ mock_meta_loader.return_value = sample_cube_model
+
+ converter = LookMLConverter(parseonly=True)
+ result = converter.explores("http://test.com/meta", "token")
+
+ assert 'cube_model' in result
+ assert 'parsed_model' in result
+ assert result['cube_model'] == sample_cube_model
+
+ @patch('lkml2cube.converter.console')
+ @patch('lkml2cube.converter.yaml.dump')
+ @patch('lkml2cube.converter.parse_meta')
+ @patch('lkml2cube.converter.meta_loader')
+ def test_explores_printonly(self, mock_meta_loader, mock_parse_meta,
+ mock_yaml_dump, mock_console):
+ """Test explores method with printonly=True."""
+ sample_cube_model = {'cubes': [{'name': 'orders'}]}
+ sample_lookml_model = {'views': [{'name': 'orders'}], 'explores': []}
+ yaml_output = "views:\n- name: orders\nexplores: []\n"
+
+ mock_meta_loader.return_value = sample_cube_model
+ mock_parse_meta.return_value = sample_lookml_model
+ mock_yaml_dump.return_value = yaml_output
+
+ converter = LookMLConverter(printonly=True)
+ result = converter.explores("http://test.com/meta", "token")
+
+ assert 'lookml_model' in result
+ assert 'yaml_output' in result
+ assert result['yaml_output'] == yaml_output
+ mock_console.print.assert_called_once_with(yaml_output)
+
+ @patch('lkml2cube.converter.print_summary')
+ @patch('lkml2cube.converter.write_lookml_files')
+ @patch('lkml2cube.converter.parse_meta')
+ @patch('lkml2cube.converter.meta_loader')
+ def test_explores_write_files(self, mock_meta_loader, mock_parse_meta,
+ mock_write_lookml_files, mock_print_summary):
+ """Test explores method with file writing."""
+ sample_cube_model = {'cubes': [{'name': 'orders'}]}
+ sample_lookml_model = {'views': [{'name': 'orders'}], 'explores': []}
+ sample_summary = {'views': [{'name': 'orders', 'path': '/tmp/orders.lkml'}], 'explores': []}
+
+ mock_meta_loader.return_value = sample_cube_model
+ mock_parse_meta.return_value = sample_lookml_model
+ mock_write_lookml_files.return_value = sample_summary
+
+ converter = LookMLConverter(outputdir="/tmp")
+ result = converter.explores("http://test.com/meta", "token")
+
+ assert 'lookml_model' in result
+ assert 'summary' in result
+ assert result['summary'] == sample_summary
+
+ mock_write_lookml_files.assert_called_once_with(sample_lookml_model, outputdir="/tmp")
+ mock_print_summary.assert_called_once_with(sample_summary)
+
+
+class TestLookMLConverterUtilities:
+ """Test utility methods."""
+
+ @patch('lkml2cube.converter.file_loader')
+ def test_validate_files_all_valid(self, mock_file_loader):
+ """Test validate_files with all valid files."""
+ mock_file_loader.side_effect = [
+ {'views': [{'name': 'orders'}]},
+ {'views': [{'name': 'customers'}]}
+ ]
+
+ converter = LookMLConverter()
+ result = converter.validate_files(["orders.lkml", "customers.lkml"])
+
+ assert result == {"orders.lkml": True, "customers.lkml": True}
+ assert mock_file_loader.call_count == 2
+
+ @patch('lkml2cube.converter.file_loader')
+ def test_validate_files_mixed_results(self, mock_file_loader):
+ """Test validate_files with mixed valid/invalid files."""
+ mock_file_loader.side_effect = [
+ {'views': [{'name': 'orders'}]},
+ None, # Invalid file
+ Exception("Parse error") # Exception
+ ]
+
+ converter = LookMLConverter()
+ result = converter.validate_files(["orders.lkml", "invalid.lkml", "error.lkml"])
+
+ assert result == {
+ "orders.lkml": True,
+ "invalid.lkml": False,
+ "error.lkml": False
+ }
+
+ @patch('lkml2cube.converter.file_loader')
+ def test_validate_files_empty_list(self, mock_file_loader):
+ """Test validate_files with empty file list."""
+ converter = LookMLConverter()
+ result = converter.validate_files([])
+
+ assert result == {}
+ mock_file_loader.assert_not_called()
+
+
+class TestLookMLConverterIntegration:
+ """Integration tests using real sample files."""
+
+ def setup_method(self):
+ """Set up test fixtures."""
+ self.samples_dir = join(dirname(__file__), "samples")
+ # Clear the global visited_path cache to prevent interference between tests
+ from lkml2cube.parser import loader
+ loader.visited_path.clear()
+
+ def test_cubes_integration_with_sample_files(self):
+ """Test cubes method with actual sample files."""
+ converter = LookMLConverter(parseonly=True, rootdir=self.samples_dir)
+
+ # Use the sample orders view file (full path as expected by file_loader)
+ file_path = join(self.samples_dir, "lkml/views/orders.view.lkml")
+ result = converter.cubes(file_path)
+
+ assert 'lookml_model' in result
+ assert 'parsed_model' in result
+ assert result['lookml_model'] is not None
+ assert 'views' in result['lookml_model']
+ assert len(result['lookml_model']['views']) > 0
+
+ def test_views_integration_with_sample_files(self):
+ """Test views method with actual sample files."""
+ converter = LookMLConverter(parseonly=True, rootdir=self.samples_dir)
+
+ # Use the sample explores file (full path as expected by file_loader)
+ file_path = join(self.samples_dir, "lkml/explores/orders_summary.model.lkml")
+ result = converter.views(file_path)
+
+ assert 'lookml_model' in result
+ assert 'parsed_model' in result
+ assert result['lookml_model'] is not None
+
+ def test_validate_files_with_sample_files(self):
+ """Test validate_files with actual sample files."""
+ converter = LookMLConverter(rootdir=self.samples_dir)
+
+ # Use full paths as expected by file_loader
+ file_paths = [
+ join(self.samples_dir, "lkml/views/orders.view.lkml"),
+ join(self.samples_dir, "lkml/views/nonexistent.view.lkml")
+ ]
+
+ result = converter.validate_files(file_paths)
+
+ # First file should be valid, second should be invalid
+ assert len(result) == 2
+        assert result[file_paths[0]] is True
+        assert result[file_paths[1]] is False
+
+
+class TestLookMLConverterErrorHandling:
+ """Test error handling and edge cases."""
+
+ def test_cubes_with_invalid_file_path(self):
+ """Test cubes method with invalid file path."""
+ converter = LookMLConverter()
+
+ with pytest.raises(ValueError, match="No files were found on path"):
+ converter.cubes("nonexistent_file.lkml")
+
+ def test_views_with_invalid_file_path(self):
+ """Test views method with invalid file path."""
+ converter = LookMLConverter()
+
+ with pytest.raises(ValueError, match="No files were found on path"):
+ converter.views("nonexistent_file.lkml")
+
+    def test_explores_with_empty_token(self):
+        """Test explores method with an empty token."""
+ converter = LookMLConverter()
+
+ with pytest.raises(ValueError, match="A valid token must be provided"):
+ converter.explores("http://invalid.com/meta", "")
+
+ @patch('lkml2cube.converter.meta_loader')
+ def test_explores_with_api_error(self, mock_meta_loader):
+ """Test explores method with API error."""
+ mock_meta_loader.side_effect = Exception("API Error")
+ converter = LookMLConverter()
+
+ with pytest.raises(Exception, match="API Error"):
+ converter.explores("http://test.com/meta", "token")
+
+ @patch('lkml2cube.converter.file_loader')
+ def test_cubes_with_parse_error(self, mock_file_loader):
+ """Test cubes method with parsing error."""
+ mock_file_loader.side_effect = Exception("Parse error")
+ converter = LookMLConverter()
+
+ with pytest.raises(Exception, match="Parse error"):
+ converter.cubes("test.lkml")
+
+ def test_set_config_with_invalid_types(self):
+ """Test set_config with various parameter types."""
+ converter = LookMLConverter()
+
+ # These should work without error
+ converter.set_config(outputdir="string")
+ converter.set_config(parseonly=True)
+ converter.set_config(parseonly=False)
+ converter.set_config(rootdir=None)
+
+ # Verify the values were set correctly
+ assert converter.outputdir == "string"
+ assert converter.parseonly is False
+ assert converter.rootdir is None
+
+
+class TestLookMLConverterYAMLConfiguration:
+ """Test YAML configuration setup."""
+
+ def test_yaml_representers_configured(self):
+ """Test that YAML representers are properly configured."""
+ from lkml2cube.parser.types import folded_unicode, literal_unicode
+
+ # Creating a converter should configure YAML representers
+ converter = LookMLConverter()
+
+ # Test that the representers work
+ folded_text = folded_unicode("This is folded text")
+ literal_text = literal_unicode("This is literal text")
+
+ # These should not raise errors
+ yaml_output = yaml.dump({"folded": folded_text, "literal": literal_text})
+ assert isinstance(yaml_output, str)
+
+ def test_multiple_converter_instances(self):
+ """Test that multiple converter instances work correctly."""
+ converter1 = LookMLConverter(outputdir="/tmp1")
+ converter2 = LookMLConverter(outputdir="/tmp2")
+
+ assert converter1.outputdir == "/tmp1"
+ assert converter2.outputdir == "/tmp2"
+
+ # Configuration should be independent
+ converter1.set_config(parseonly=True)
+ assert converter1.parseonly is True
+ assert converter2.parseonly is False
+
+
+class TestLookMLConverterCacheManagement:
+ """Test cache management functionality."""
+
+ def test_clear_cache_method_exists(self):
+ """Test that clear_cache method exists and can be called."""
+ converter = LookMLConverter()
+
+ # Method should exist and be callable
+ assert hasattr(converter, 'clear_cache')
+ assert callable(converter.clear_cache)
+
+ # Should not raise any errors
+ converter.clear_cache()
+
+ def test_clear_cache_actually_clears_visited_path(self):
+ """Test that clear_cache actually clears the visited_path cache."""
+ from lkml2cube.parser import loader
+
+ # Set up some dummy cache data
+ loader.visited_path['test_file.lkml'] = True
+ loader.visited_path['another_file.lkml'] = True
+
+ # Verify cache has data
+ assert len(loader.visited_path) == 2
+ assert 'test_file.lkml' in loader.visited_path
+ assert 'another_file.lkml' in loader.visited_path
+
+ # Clear cache using converter method
+ converter = LookMLConverter()
+ converter.clear_cache()
+
+ # Verify cache is cleared
+ assert len(loader.visited_path) == 0
+ assert 'test_file.lkml' not in loader.visited_path
+ assert 'another_file.lkml' not in loader.visited_path
+
+ def test_clear_cache_with_empty_cache(self):
+ """Test that clear_cache works when cache is already empty."""
+ from lkml2cube.parser import loader
+
+ # Ensure cache starts empty
+ loader.visited_path.clear()
+ assert len(loader.visited_path) == 0
+
+ # Clear cache - should not raise any errors
+ converter = LookMLConverter()
+ converter.clear_cache()
+
+ # Cache should still be empty
+ assert len(loader.visited_path) == 0
+
+ @patch('lkml2cube.converter.file_loader')
+ def test_clear_cache_integration_with_file_loading(self, mock_file_loader):
+ """Test that clear_cache works in integration with file loading operations."""
+ from lkml2cube.parser import loader
+
+ # Mock the file loader to return a simple model
+ sample_lookml = {'views': [{'name': 'test_view'}]}
+ mock_file_loader.return_value = sample_lookml
+
+ converter = LookMLConverter(parseonly=True)
+
+ # First call should populate cache (through file_loader)
+ converter.cubes("test.lkml")
+
+ # Manually add something to cache to simulate file_loader behavior
+ loader.visited_path['test.lkml'] = True
+ assert len(loader.visited_path) == 1
+
+ # Clear cache
+ converter.clear_cache()
+
+ # Cache should be empty
+ assert len(loader.visited_path) == 0
+
+ # Another call should work normally
+ result = converter.cubes("test.lkml")
+ assert result['lookml_model'] == sample_lookml
\ No newline at end of file
diff --git a/tests/test_e2e.py b/tests/test_e2e.py
index 93f080e..b4914a3 100644
--- a/tests/test_e2e.py
+++ b/tests/test_e2e.py
@@ -12,6 +12,12 @@
class TestExamples:
+ def setup_method(self):
+ """Set up test fixtures."""
+ # Clear the global visited_path cache to prevent interference between tests
+ from lkml2cube.parser import loader
+ loader.visited_path.clear()
+
def test_simple_view(self):
file_path = "lkml/views/orders.view.lkml"
# print(join(rootdir, file_path))