Questions about apache-spark-2.0



Dynamic allocation of Spark 2 cluster resources to running jobs

We have a Spark 2 HDInsight cluster with 650 GB of memory and 195 vcores, spread across 9 worker nodes and 2 head nodes. The problem is that the jobs are not fully utilizing the cluster. For example, wh...

TomG

apache-spark

hadoop-yarn

azure-hdinsight

apache-spark-2.0

Votes: 0

Answers: 1
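
The question body is truncated, so the poster's exact setup is unknown; as a generic illustration of the idea in the title, the sketch below shows Spark dynamic allocation on YARN from PySpark. The app name, executor sizes, and executor cap are assumed values only and would need tuning against the 650 GB / 195 vcore cluster described above.

from pyspark.sql import SparkSession

# Minimal sketch: enable dynamic allocation so executors are requested and
# released as the workload changes, instead of holding a fixed allocation.
spark = (
    SparkSession.builder
    .appName("dynamic-allocation-sketch")  # hypothetical app name
    .config("spark.dynamicAllocation.enabled", "true")
    .config("spark.dynamicAllocation.minExecutors", "2")     # illustrative
    .config("spark.dynamicAllocation.maxExecutors", "40")    # illustrative cap
    .config("spark.dynamicAllocation.executorIdleTimeout", "60s")
    # In Spark 2.x, dynamic allocation needs the external shuffle service
    # running on the YARN node managers so executors can be released safely.
    .config("spark.shuffle.service.enabled", "true")
    .config("spark.executor.memory", "6g")   # illustrative sizing
    .config("spark.executor.cores", "3")     # illustrative sizing
    .getOrCreate()
)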

Latest Answer

I suggest that you change your YARN scheduler to the Capacity Scheduler; it is better at sharing resources across jobs. By default, Hadoop is 'First in ...

Matt Andruff
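
As a rough illustration of the answer's suggestion, the excerpt below shows the yarn-site.xml property that selects the Capacity Scheduler on a plain Hadoop install, plus an assumed two-queue split in capacity-scheduler.xml. On HDInsight these settings are normally changed through Ambari rather than by editing files by hand, and the queue names and capacities here are examples, not the answerer's configuration.

<!-- yarn-site.xml: select the Capacity Scheduler -->
<property>
  <name>yarn.resourcemanager.scheduler.class</name>
  <value>org.apache.hadoop.yarn.server.resourcemanager.scheduler.capacity.CapacityScheduler</value>
</property>

<!-- capacity-scheduler.xml: an assumed two-queue layout; adjust to your workloads -->
<property>
  <name>yarn.scheduler.capacity.root.queues</name>
  <value>default,batch</value>
</property>
<property>
  <name>yarn.scheduler.capacity.root.default.capacity</name>
  <value>60</value>
</property>
<property>
  <name>yarn.scheduler.capacity.root.batch.capacity</name>
  <value>40</value>
</property>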
