Dynamic allocation of Spark 2 cluster resources to running jobs
We have a Spark 2 HDInsight cluster with 650 GB of RAM and 195 VCores, made up of 9 worker nodes and 2 head nodes. The problem is that jobs are not fully utilizing the cluster. For example, wh...

TomG
Votes: 0
Answers: 1
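Under-utilization like this is often just dynamic allocation being left off. A minimal sketch of turning it on for a YARN-backed session; the app name, executor sizes, and executor cap below are illustrative assumptions, not values from the question:

    from pyspark.sql import SparkSession

    # Hypothetical session config; tune executor size and maxExecutors to the cluster.
    spark = (
        SparkSession.builder
        .appName("dynamic-allocation-demo")  # illustrative app name
        .config("spark.dynamicAllocation.enabled", "true")
        .config("spark.shuffle.service.enabled", "true")   # external shuffle service is required on YARN
        .config("spark.dynamicAllocation.minExecutors", "1")
        .config("spark.dynamicAllocation.maxExecutors", "20")  # illustrative cap
        .config("spark.executor.memory", "6g")   # illustrative
        .config("spark.executor.cores", "3")     # illustrative
        .getOrCreate()
    )

With these settings Spark requests more executors while tasks are backlogged and releases them when idle, instead of holding a fixed allocation for the life of the job.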
How can I show Hive tables using PySpark?
Hello, I created a Spark HDInsight cluster on Azure and I'm trying to read Hive tables with PySpark, but the problem is that it shows me only the default database.
Does anyone have an idea?
Amine Mokrani
Votes: 0
Answers: 3
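A common cause of this symptom is building the session without Hive support, which leaves PySpark on its built-in catalog where only the default database exists. A minimal sketch, assuming hive-site.xml is already on the cluster's classpath; the app and table names are hypothetical:

    from pyspark.sql import SparkSession

    # enableHiveSupport() points the session at the Hive metastore
    # instead of Spark's in-memory catalog.
    spark = (
        SparkSession.builder
        .appName("hive-catalog-demo")  # illustrative app name
        .enableHiveSupport()
        .getOrCreate()
    )

    spark.sql("SHOW DATABASES").show()          # should now list more than 'default'
    spark.sql("SHOW TABLES IN default").show()  # tables in a given database
    df = spark.table("default.some_table")      # hypothetical table name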
Spark: I see more executors than my cluster has available cores
I'm working with Spark and YARN on an Azure HDInsight cluster, and I have some trouble understanding the relationship between the workers' resources, executors, and containers.
My cluster has 10 worke...
andream
Votes: 0
Answers: 0
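On YARN, each executor runs inside one container, and YARN accounts for spark.executor.memory plus a per-container overhead, so the numbers in the YARN UI rarely match the raw executor settings. A sketch of pinning the sizing explicitly so the relationship is visible; all values are illustrative:

    from pyspark.sql import SparkSession

    # With dynamic allocation off, YARN starts exactly
    # spark.executor.instances containers (plus one for the driver).
    spark = (
        SparkSession.builder
        .appName("executor-sizing-demo")        # illustrative app name
        .config("spark.dynamicAllocation.enabled", "false")
        .config("spark.executor.instances", "9")       # illustrative: one per worker
        .config("spark.executor.cores", "4")           # cores each executor uses
        .config("spark.executor.memory", "8g")         # heap per executor
        .config("spark.executor.memoryOverhead", "1g") # Spark 2.3+; older versions use spark.yarn.executor.memoryOverhead
        .getOrCreate()
    )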
Spark cluster is not dynamically allocating resources to jobs
The cluster is HDInsight 4.0 and has 250 GB of RAM and 75 VCores.
I am running only one job, and the cluster always allocates 66 GB, 7 VCores, and 7 containers to it, even though we have 250 GB and...

TomG
Votes: 0
Answers: 1
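A fixed allocation that ignores load usually means the job is running on static defaults (a small spark.executor.instances) rather than dynamic allocation, though YARN queue limits can cap an application the same way. One way to check what the running session actually resolved, as a sketch:

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    # Print the resource-related settings the session resolved,
    # to confirm whether dynamic allocation is actually enabled.
    for key, value in sorted(spark.sparkContext.getConf().getAll()):
        if "executor" in key or "dynamicAllocation" in key:
            print(key, "=", value)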