
labs.create_cloud_connection() Bug - Not able to authenticate when creating a connection against Lakehouse #1025

@DieselAnalytics

Description

Describe the bug
I am attempting to use labs.create_cloud_connection() to create a connection to a lakehouse using the code below:

conn = labs.create_cloud_connection(
    name=f"CloudConn-{lakehouse_name}-SQL",
    server_name=lakehouse_sql_endpoint,
    database_name=lakehouse_name,
    user_name="placeholder",
    password="placeholder",
    privacy_level="Organizational",
)

I get the following error when I try to execute the above code:

FabricHTTPException: 400 Bad Request for url: https://api.fabric.microsoft.com//v1/connections
Error: {"requestId":"013ee7de-9af8-43e2-b0f4-51e30989fb7d","errorCode":"DM_GWPipeline_Gateway_DataSourceAccessError","moreDetails":[{"errorCode":"DM_ErrorDetailNameCode_UnderlyingErrorCode","message":"-2146232060"},{"errorCode":"DM_ErrorDetailNameCode_UnderlyingErrorMessage","message":"Could not login because the authentication failed."},{"errorCode":"DM_ErrorDetailNameCode_UnderlyingHResult","message":"-2146232060"},{"errorCode":"DM_ErrorDetailNameCode_UnderlyingNativeErrorCode","message":"18456"}],"message":"PowerBI service client received error HTTP response. HttpStatus: 400. PowerBIErrorCode: DM_GWPipeline_Gateway_DataSourceAccessError"}
Headers: {'Cache-Control': 'no-store, must-revalidate, no-cache', 'Pragma': 'no-cache', 'Transfer-Encoding': 'chunked', 'Content-Type': 'application/json; charset=utf-8', 'x-ms-public-api-error-code': 'DM_GWPipeline_Gateway_DataSourceAccessError', 'Strict-Transport-Security': 'max-age=31536000; includeSubDomains', 'X-Frame-Options': 'deny', 'X-Content-Type-Options': 'nosniff', 'RequestId': '013ee7de-9af8-43e2-b0f4-51e30989fb7d', 'Access-Control-Expose-Headers': 'RequestId', 'request-redirected': 'true', 'home-cluster-uri': 'https://wabi-west-us3-a-primary-redirect.analysis.windows.net/', 'Date': 'Wed, 07 Jan 2026 23:13:41 GMT'}
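For anyone triaging this: the response body can be unpacked to show the underlying failure. The native error code 18456 is SQL Server's generic "login failed for user" error, so the gateway did reach the SQL endpoint but the credentials were rejected (the "placeholder" user name/password would do this, since the Fabric SQL analytics endpoint accepts Microsoft Entra ID authentication rather than SQL basic auth). A minimal sketch that flattens the moreDetails list from the error above:

```python
import json

# Response body from the FabricHTTPException above, trimmed to the
# fields that matter for diagnosis.
body = json.loads(
    '{"requestId": "013ee7de-9af8-43e2-b0f4-51e30989fb7d",'
    ' "errorCode": "DM_GWPipeline_Gateway_DataSourceAccessError",'
    ' "moreDetails": ['
    '   {"errorCode": "DM_ErrorDetailNameCode_UnderlyingErrorMessage",'
    '    "message": "Could not login because the authentication failed."},'
    '   {"errorCode": "DM_ErrorDetailNameCode_UnderlyingNativeErrorCode",'
    '    "message": "18456"}]}'
)

# Flatten the moreDetails list into a dict keyed by detail code.
details = {d["errorCode"]: d["message"] for d in body["moreDetails"]}

# 18456 = SQL Server "login failed for user": the connection was made,
# but the supplied credentials were rejected by the endpoint.
print(details["DM_ErrorDetailNameCode_UnderlyingNativeErrorCode"])  # → 18456
```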

My ultimate goal is to do the following:

  1. Create an empty semantic model
  2. Add a connection based on a lakehouse to the semantic model created in Step 1. The connection will use the lakehouse's SQL endpoint and the semantic model will use import mode.
  3. Add the tables from the lakehouse to the semantic model in import mode

Metadata

Labels: bug (Something isn't working)