Azure Synapse Analytics Blog

Notebook - 'This request is not authorized to perform this operation.', 403

Liliam_C_Leme, Microsoft
Sep 28, 2020

This is a quick post about this failure and how to fix it:

Error: org.apache.spark.sql.AnalysisException: java.lang.RuntimeException:
The operation failed: 'This request is not authorized to perform this operation.', 403

 

First, let's add some context:

 

When you are working in a Synapse workspace with its managed identity, you need to grant the Storage Blob Data Contributor role on the storage account to the workspace's managed identity:

https://docs.microsoft.com/en-us/azure/synapse-analytics/security/how-to-grant-workspace-managed-identity-permissions

More information here: https://docs.microsoft.com/en-us/azure/synapse-analytics/security/synapse-workspace-managed-identity
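
If you prefer to script that grant, here is a minimal sketch using the azure-mgmt-authorization Python SDK. This is an illustration under stated assumptions, not part of the original post: the subscription, resource group, storage account, and workspace managed identity object ID are placeholders you must fill in, and ba92f5b4-2d11-453d-a403-e96b0029c9fe is the built-in role definition ID for Storage Blob Data Contributor.

import uuid
from azure.identity import DefaultAzureCredential
from azure.mgmt.authorization import AuthorizationManagementClient
from azure.mgmt.authorization.models import RoleAssignmentCreateParameters

subscription_id = "<subscription-id>"  # placeholder
scope = (
    f"/subscriptions/{subscription_id}/resourceGroups/<resource-group>"
    "/providers/Microsoft.Storage/storageAccounts/<storage-account>"
)
# Built-in role definition ID for Storage Blob Data Contributor
role_definition_id = (
    f"/subscriptions/{subscription_id}/providers/Microsoft.Authorization"
    "/roleDefinitions/ba92f5b4-2d11-453d-a403-e96b0029c9fe"
)

client = AuthorizationManagementClient(DefaultAzureCredential(), subscription_id)
client.role_assignments.create(
    scope,
    str(uuid.uuid4()),  # each role assignment needs a fresh GUID name
    RoleAssignmentCreateParameters(
        role_definition_id=role_definition_id,
        principal_id="<workspace-managed-identity-object-id>",  # placeholder
        principal_type="ServicePrincipal",
    ),
)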

 

Speaking of managed identities, a quick review: "A common challenge when building cloud applications is how to manage the credentials in your code for authenticating to cloud services. You can use the identity to authenticate to any service that supports Azure AD authentication, including Key Vault, without any credentials in your code."

More here: https://docs.microsoft.com/en-us/azure/active-directory/managed-identities-azure-resources/overview?toc=/azure/synapse-analytics/toc.json&bc=/azure/synapse-analytics/breadcrumb/toc.json
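
To see what "no credentials in your code" looks like in practice, here is a minimal sketch, assuming you are inside a Synapse notebook where the mssparkutils helper is preloaded and that "Storage" is the audience key for Azure Storage:

%%pyspark
# The token is issued to the identity running the session; no secret appears in code.
token = mssparkutils.credentials.getToken("Storage")
print(token[:16] + "...")  # print only a prefix; never log full tokens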

 

So there you are: your workspace has the managed identity permissions granted, and you are running a notebook to create a database on Spark.

Note that you are the one running the notebook in Synapse Studio, and you are not passing any credentials, since you are working inside the Synapse workspace:

%%spark
spark.sql("CREATE DATABASE IF NOT EXISTS nyctaxi")

It fails:

Error: org.apache.spark.sql.AnalysisException: java.lang.RuntimeException: Operation failed: "This request is not authorized to perform this operation using this permission.", 403, HEAD, https://StorageAccountName.dfs.core.windows.net/ContainerName/tmp/hive?upn=false&action=getStatus&timeout=90;

Solution:

Add the Storage Blob Data Contributor RBAC role on the storage account to the user who is running the notebook.

Steps here: https://docs.microsoft.com/en-us/azure/synapse-analytics/security/how-to-grant-workspace-managed-identity-permissions (Grant permissions to managed identity after workspace creation)
 
In this case, the script runs under the identity of the user executing the notebook, so that user needs the permission as well.
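
Once the role is granted (RBAC changes can take a few minutes to propagate), a quick sanity check from the notebook is to list the container. This is a sketch with placeholder names, assuming mssparkutils is available in the session:

%%pyspark
# Listing should now succeed instead of raising the 403 AnalysisException.
for f in mssparkutils.fs.ls("abfss://<ContainerName>@<StorageAccountName>.dfs.core.windows.net/"):
    print(f.name)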
 
 
That is it!

Liliam, UK Engineer

 


17 Comments

  • Hi,

     

    I was able to solve the issue of reading data from ADLS Gen2 by using Synapse's native capability to create a linked service connection to the file.

     

    This works the same as a linked service from a Synapse pipeline, so we can reuse those linked services as connections for reading the files.

    When the linked service's authentication type is managed identity, we can use the code below to read the file (this can be extended to JSON, Parquet, and other formats):

     

    %%pyspark
    # Read a CSV file through the linked service's managed identity
    spark.conf.set("spark.storage.synapse.linkedServiceName", "<LINKED SERVICE NAME>")
    spark.conf.set("fs.azure.account.oauth.provider.type", "com.microsoft.azure.synapse.tokenlibrary.LinkedServiceBasedTokenProvider")
    df = spark.read.csv('abfss://<CONTAINER>@<ACCOUNT>.dfs.core.windows.net/<DIRECTORY PATH>')
    df.show()
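
    As an illustration of extending it to other formats (a sketch reusing the same placeholder names as above), only the reader call changes:

    %%pyspark
    spark.conf.set("spark.storage.synapse.linkedServiceName", "<LINKED SERVICE NAME>")
    spark.conf.set("fs.azure.account.oauth.provider.type", "com.microsoft.azure.synapse.tokenlibrary.LinkedServiceBasedTokenProvider")
    # Same auth setup; swap the reader for the file format:
    df_parquet = spark.read.parquet('abfss://<CONTAINER>@<ACCOUNT>.dfs.core.windows.net/<DIRECTORY PATH>')
    df_json = spark.read.json('abfss://<CONTAINER>@<ACCOUNT>.dfs.core.windows.net/<DIRECTORY PATH>')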

     

    Hope this helps.

  • Hi @Liliam_Leme,

    I am trying to read a file in a storage account from my Spark notebook.

     

    I am able to create a linked service to the storage and to read and write files using Copy Data, but I am not able to read the file from a Spark notebook.

     

    I gave the Synapse MSI and my user group "Storage Blob Data Contributor" access on the storage.

    I added the Synapse workspace to the firewall rules.

     

    Still getting the same error: "This request is not authorized to perform this operation.", 403.

  • pranp (Copper Contributor)

    I am currently having the same problem. The storage account it is trying to access was created together with Azure Synapse Analytics, and I have the Contributor permission on the storage account as well. The problem goes away when I change the firewall setting to 'All networks'. I also have a private endpoint from the storage account set in the managed VNet.

  • Liliam_C_Leme 

    1. I don't have VNet
    2. There are no firewall restrictions
    3. No, Synapse and storage are not in the same subscription
    4. Yes I can browse folder through Integration dataset created by Linked service
    5. The error is intermittent, not consistent
      1. In most runs I get the error, and in some runs it succeeds
    6. I am connecting using an SPN through Spark conf:
      1. spark.conf.set("spark.storage.synapse.linkedServiceName", linkedServiceName)
        spark.conf.set("fs.azure.account.oauth.provider.type", "com.microsoft.azure.synapse.tokenlibrary.LinkedServiceBasedTokenProvider")
  • I get the same message running azcopy at the command line. I have authorized myself as owner of the container, the storage account, and the resource group, and have logged into azcopy using the AD tenant, per the instructions. It works fine when I use SAS but not with Active Directory authentication. Frankly, I don't see how it can work, because the AD tenant is not tied to the container, but those are the instructions.

     

  • Hi GauravKhattar, do you have VNet enabled? Is the error the same with or without the firewall? Is this storage account in the same subscription? Can you connect to it with the Synapse linked service? Is Synapse configured to pass through the firewall (if it is enabled)? Connect to a secure storage account from your Azure Synapse workspace - Azure Synapse Analytics | Microsoft Docs

    Did you create this workspace on top of a former SQL DW? If that is the case, can you try adding a private endpoint?

  • Hi Liliam_C_Leme 

    I have assigned the following permissions:

    1. Myself and the Synapse identity as Storage Blob Data Contributor
    2. Myself as Synapse Administrator
    3. Myself and the Synapse identity on the container ACL with Read, Write, and Execute permissions

    I'm still getting this error.