Spark Notebook error: java.sql.SQLException: User does not have permissions to perform this action

Published 07-24-2020 01:16 AM

I was working on a case last week involving a permission error on a Spark notebook. The scenario was:

1. Loading data from another database into the DB container.

2. Loading data from the Data Warehouse using a Spark notebook.

When the second step was executed, the error below was thrown:

Error: java.sql.SQLException: does not have permissions to perform this action


So the error message is pretty clear: this is a permission error. The solution was as simple as the message.

We created a SQL user on the database specifically for this process. As the process requires only data-reader permission, that was the role granted to the user.

---Run on Master DB
CREATE LOGIN loginmame WITH PASSWORD = Lalala!0001' GO ---Run on SQL DW DB CREATE USER username FOR LOGIN loginname WITH DEFAULT_SCHEMA = dbo GO -- Add user to the database role EXEC sp_addrolemember N'db_datareader', N'username' GO GRANT CONNECT TO username;

After that, we changed the notebook process to run using the SQL user/password that we had just created, as follows.


Spark script using the SQL user, to be executed in the notebook (Spark Scala):

import com.microsoft.spark.sqlanalytics.utils.Constants
import org.apache.spark.sql.SqlAnalyticsConnector._

val df = spark.read.
    option(Constants.SERVER, "").             // fill in your SQL endpoint
    option(Constants.USER, "username").       // SQL user created above
    option(Constants.PASSWORD, "password").
    sqlanalytics("DBName.dbo.TableName")      // three-part table name (placeholder)


Figure 1 also exemplifies this.


That is it!


Liliam Leme

UK Engineer

Senior Member

And is this also possible using a managed identity? It does not look like it.


You could use AAD. You could also change the context of execution by running the notebook from a pipeline, and then handle authentication in a different way. For the notebook itself, let me look into it.


@Liliam_Leme @Johannes_Vink I ran into the same error. When I added the Synapse workspace managed identity as an external user in the database and subsequently granted db_owner rights (instead of only db_datareader) to that external user, I did not get this error. I am not sure what role is minimally required for the external user, but db_datareader is not sufficient (even when you only need to read data in your pipeline).
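A hedged sketch of the grants described above; mysynapseworkspace is a placeholder for your Synapse workspace (managed identity) name, not a value from this thread.

```sql
-- Add the Synapse workspace managed identity as an external (Azure AD) user.
-- 'mysynapseworkspace' is a placeholder for the workspace name.
CREATE USER [mysynapseworkspace] FROM EXTERNAL PROVIDER;

-- db_datareader alone was reported as insufficient in this scenario;
-- db_owner made the pipeline succeed.
EXEC sp_addrolemember N'db_owner', N'mysynapseworkspace';
```

Granting db_owner is broader than a read-only pipeline should need, so it is worth treating this as a workaround until the minimally required role is confirmed.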


@rebremer, thanks for sharing. But were you reading data from the SQL pool with the managed identity in the notebook, and db_datareader failed? Is that what happened? Or were you trying to read the data from the pipeline and the permission was not enough?

@Johannes_Vink - I forgot to add more information about your question:

there are limitations for managed identities on SQL Pool documented here:

But the syntax would be like: .option("useAzureMSI", "true") - anyway, there is some discussion about this here and they updated the docs here:

Alternatively, if you use ADLS Gen2 + OAuth 2.0 authentication or your Azure Synapse instance is configured to have a Managed Service Identity (typically in conjunction with a VNet + Service Endpoints setup), you must set useAzureMSI to true. In this case the connector will specify IDENTITY = 'Managed Service Identity' for the database scoped credential and no SECRET.
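As a sketch (Scala), the quoted option belongs on the connector read, roughly like this. The format name and options follow the Databricks-style Azure Synapse connector that the quoted documentation describes; the server, storage container, and table names are placeholders, not values from this thread.

```scala
// Sketch only: Databricks-style Azure Synapse connector with Managed Service Identity.
// All names in angle brackets are placeholders.
val df = spark.read
  .format("com.databricks.spark.sqldw")
  .option("url", "jdbc:sqlserver://<server>.database.windows.net;database=<db>")
  .option("useAzureMSI", "true")   // connector creates the database scoped credential with MSI, no SECRET
  .option("tempDir", "abfss://<container>@<account>.dfs.core.windows.net/tmp")
  .option("dbTable", "dbo.MyTable")
  .load()
```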


@Liliam_Leme , my setup was as follows:

- External SQLDW in which the Synapse workspace MI is added as external user

- Spark notebook in Synapse workspace that retrieves data from external SQLDW (and builds an ML model)

- Spark notebook is triggered using Synapse pipeline


The following happens:

- If I grant the Synapse workspace MI user db_datareader rights in the external SQLDW, the pipeline fails

- If I grant the Synapse workspace MI user full db_owner rights in the external SQLDW, the pipeline runs successfully
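To confirm which roles the MI user actually holds in the external SQLDW, a query against the database's system views can help; mysynapseworkspace below is a placeholder for the external user's name.

```sql
-- List database role memberships for a given user (placeholder name).
SELECT r.name AS role_name, m.name AS member_name
FROM sys.database_role_members drm
JOIN sys.database_principals r ON drm.role_principal_id = r.principal_id
JOIN sys.database_principals m ON drm.member_principal_id = m.principal_id
WHERE m.name = N'mysynapseworkspace';
```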



@rebremer interesting scenario. Let me check.
