We might encounter failures while installing packages in a serverless Apache Spark pool. In this article, I will walk through the basic steps for troubleshooting package installation failures in an Apache Spark pool.
Before we dive in, please take a look at the page here, which explains how to update packages/libraries (jar/whl/etc.) in a Spark pool.
For this article, I will use the example of installing a wheel (.whl) file under Spark packages. I uploaded the whl file and started the execution, as shown in the snapshot below:
The settings are being applied:
And then, unexpectedly, it failed. If you go through "view details", you might not be able to decode the exact error, which makes it hard to determine the next resolution step.
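Before digging into the cluster logs, it can help to sanity-check the uploaded wheel locally. A wheel is just a zip archive, so you can read its declared dependencies straight from the embedded METADATA file. Below is a minimal sketch; the wheel filename is a placeholder for your own file:

```python
import zipfile

# Placeholder filename; point this at the wheel you uploaded to the pool.
wheel_path = "my_package-0.1.0-py3-none-any.whl"

with zipfile.ZipFile(wheel_path) as wf:
    # Every wheel embeds a <name>.dist-info/METADATA file describing
    # the package and its declared dependencies.
    metadata_name = next(
        n for n in wf.namelist() if n.endswith(".dist-info/METADATA")
    )
    for line in wf.read(metadata_name).decode("utf-8").splitlines():
        # Requires-Dist lines show the dependency pins (e.g. pandas)
        # that must be satisfiable on the Spark pool.
        if line.startswith(
            ("Name:", "Version:", "Requires-Python:", "Requires-Dist:")
        ):
            print(line)
```

If a Requires-Dist or Requires-Python line looks incompatible with the pool's runtime, you have found your culprit without ever opening the Spark logs.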
To find driver and executor errors, you need to go to the Monitor tab. By default, in a Spark pool, packages are installed by an application called SystemReservedJob-LibraryManagement. So we will navigate to the Monitor tab, as shown below, and check the application that failed as part of the package installation.
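As an aside, if you prefer scripting over clicking through the portal, the azure-synapse-spark Python SDK can list recent Spark batch applications on the pool. This is a rough sketch, assuming the azure-synapse-spark and azure-identity packages are installed and the placeholder workspace endpoint and pool name are replaced with your own:

```python
from azure.identity import DefaultAzureCredential
from azure.synapse.spark import SparkClient

# Placeholder workspace endpoint and pool name; substitute your own.
client = SparkClient(
    credential=DefaultAzureCredential(),
    endpoint="https://<your-workspace>.dev.azuresynapse.net",
    spark_pool_name="<your-spark-pool>",
)

# List recent Spark batch applications and print their states, so the
# failed SystemReservedJob-LibraryManagement run stands out.
batch_jobs = client.spark_batch.get_spark_batch_jobs(detailed=True)
for job in batch_jobs.sessions:
    print(job.id, job.name, job.state, job.result)
```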
Let's proceed with the steps below to collect the driver stdout/stderr logs:
Click on the application that failed --> Click on Spark history server -->
Next, click on application 1 or 2 and go to the Executors tab:
Then open the stdout/stderr of the driver (the row whose Executor ID is "driver") to find the error. In my case, the error under stdout pointed me to the resolution:
The error found was:
This indicates that your file needs to be rebuilt against a newer pandas version. Once you rebuild the wheel with the upgraded pandas version, the installation should succeed.
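One quick way to confirm such a mismatch is to check which Python and pandas versions the pool actually runs, and then rebuild your wheel against those. A minimal notebook cell, assuming nothing beyond the pool's default runtime:

```python
# Run this in a notebook cell attached to the Spark pool to see which
# Python and pandas versions your wheel must be compatible with.
import sys
import pandas as pd

print("Python :", sys.version.split()[0])
print("pandas :", pd.__version__)
```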
Another such error I found was:
Source: Troubleshoot library installation errors - Azure Synapse Analytics | Microsoft Docs
Happy troubleshooting, folks!