Azure Synapse Analytics Blog

"Livy is dead" and some logs to help

Liliam_C_Leme
Aug 07, 2020

A quick post about notebooks and Spark. A customer was running some Spark jobs on Azure Synapse, and the error was "Livy is dead".

That is sad, and the customer was not sure: why is it dead?

 

So we started by going through the logs available in Synapse Studio, as documented here:

https://docs.microsoft.com/en-us/azure/synapse-analytics/monitoring/apache-spark-applications

 

We checked the driver logs, as shown in Fig. 1:

Fig. 1: Driver logs

 

We spotted this error (note that the Livy logs are also available in this same view):

java.lang.OutOfMemoryError

 

The customer was running a job on a small node size, with 4 vCPUs and 32 GB of memory. When the job processed a small amount of data, it worked. Once the amount of data grew past a certain point, it would kill Livy. It sounded like a limitation, but which limitation?
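
Before changing anything, it can help to confirm what the session actually has. Here is a minimal check you can run from inside the notebook, assuming the default spark session that Synapse notebooks provide:

# Print the memory-related settings of the current Spark session.
# These are standard Spark configuration properties; which ones are
# set depends on how the pool and session were provisioned.
conf = spark.sparkContext.getConf()
for key in ("spark.driver.memory",
            "spark.executor.memory",
            "spark.executor.cores",
            "spark.executor.instances"):
    print(key, "=", conf.get(key, "not set"))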

Regarding the memory, we changed the pool to give the job more room.

You can change this setting in the Azure portal: Synapse workspace -> Apache Spark pool settings (Fig. 2).

Fig. 2: Apache Spark pool settings

 

You can also adjust this configuration in Synapse Studio (this is a session-level configuration), as in Fig. 3:

 

Fig. 3: Pool settings
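
If you prefer to set it from the notebook itself, the %%configure magic lets you size the Livy session in a cell. A minimal sketch; the values below are only illustrative, not a recommendation for your workload:

%%configure -f
{
    "driverMemory": "28g",
    "driverCores": 4,
    "executorMemory": "28g",
    "executorCores": 4,
    "numExecutors": 2
}

The -f flag forces the session to restart with the new values, so run this at the top of the notebook before any other cell.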

 

Node size is something you define when you create the pool, as you can see here: https://docs.microsoft.com/en-us/azure/synapse-analytics/quickstart-create-apache-spark-pool-portal

 

But you can adjust it afterward.

 

That is it!

Liliam 

UK Engineer
